[Q] Which GPU is in the SM-T210R? - Galaxy Tab 3 Q&A, Help & Troubleshooting

I hope this is the right forum. I've been looking for the correct GPU in the SM-T210R Wi-Fi-only Tab. A few searches turn up a PXA986 1.2 GHz dual-core with a PowerVR SGX540, but then other results list a totally different CPU/GPU combo. The reason I ask: I'm trying to get a game that ships 4 different GPU zips (Mali, Tegra, Adreno and PowerVR), and I'm trying to locate the correct one. Any help? Thanks!
And RocketTab 3.1 for the WIN! (LOL)
Jami:laugh:

The T210/R and T211 use a Vivante GC1000 GPU.

hamed... said:
T210/R and T211 use Vivante GC1000 core gpu
So from the digging I did, that's also listed as PowerVR? There are 4 different Injustice: Gods Among Us files: Adreno, Mali, PowerVR and Tegra. I know it's not Tegra (that's Nvidia), and some Samsungs use Mali or Adreno, while the other listing pairs PowerVR with the Vivante GC1000... So thanks, I'll give the PowerVR one a go and hope it all works after all the downloading, lol. Thanks!!
Jami:good:

jkdillon627 said:
So what digging I did that is aka PowerVR? There is 4 different Injustice Gods Among us files Adreno, Mali, PowerVR and Tegra. I know it's not Tegra that's Nvidia and some Samsung uses Mali and Adreno and the other PowerVR has a / Vivante GC1000 ... So thanks I'll give the PowerVR a go and hope it all works after all the downloading lol Thanks!!
Jami:good:
Good luck:good:
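For anyone else trying to match a game's GPU-specific data zip to their device: on most Android versions you can check what the OS actually reports with `adb shell dumpsys SurfaceFlinger | grep GLES` (the GLES line lists the GL vendor and renderer). Below is a throwaway Python sketch of the renderer-to-pack mapping discussed above; the pack names are just the four zips mentioned in this thread, and the Vivante-to-PowerVR fallback is the thread's guess, not anything official:

```python
def pick_asset_pack(gl_renderer: str) -> str:
    """Map a reported GL_RENDERER string to a likely texture-pack name.

    Pack names mirror the four zips mentioned above. The Vivante ->
    PowerVR mapping is the guess made in this thread, not official.
    """
    renderer = gl_renderer.lower()
    if "adreno" in renderer:
        return "Adreno"
    if "mali" in renderer:
        return "Mali"
    if "tegra" in renderer or "nvidia" in renderer:
        return "Tegra"
    # PowerVR proper, plus the thread's guess for the Vivante GC1000
    if "powervr" in renderer or "sgx" in renderer or "vivante" in renderer:
        return "PowerVR"
    raise ValueError(f"unrecognised renderer: {gl_renderer}")

print(pick_asset_pack("Vivante GC1000"))  # -> PowerVR (per the guess above)
```

If the game still crashes with that pack, the renderer string itself (from the dumpsys output) is the thing worth posting in a reply.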

hi
jkdillon627 said:
I hope this is the right forum? I've been looking for the correct GPU on the SM-T210R Wi-Fi only Tab? I've done a few searches and it comes up PXA986 1.2 GHz dual-core with the PowerVR SGX540 ??? Then it goes off and says a totally different CPU/GPU combo.. The reason is I'm trying to get a game and it has 4 different gpu's zips from Mali, Tegra, Adreno and PoverVR and I'm trying to locate the correct one.. Any help Thanks!
And RocketTab 3.1 for the WIN! (LOL)
Jami:laugh:
Did it work for you? I am trying to install Real Racing 3 and am stuck for the same reason.

Related

[Q] Anything that shows off DHD GPU?

I keep hearing that the Desire HD has a very good ATi GPU, but so far all the 3D games I've played do not look very good and have no anti-aliasing. I saw a 3D game on the iPhone 4 that was using the Unreal Engine, and it looked amazing for a phone.
So are there any great games that look really good and show off the Desire HD GPU to its full potential?
Also, I'm wondering if it's possible to force anti-aliasing on for games that do not have it enabled (e.g. Need for Speed: Shift)?
LoneThread said:
I keep hearing that the Desire HD has a very good ATi GPU but so far all the 3D Games i've played do not look very good and have no Anti-Aliasing. I saw a 3D game on the iPhone 4 that was using the Unreal Engine and it looked amazing for a phone.
So are there any great games that look really good and show off the Desire HD GPU to its full potential?
Also I'm wondering if its possible to force Anti-Aliasing on for games that do not have it enabled? (I.e. Need for Speed: Shift)
I don't know if we can force AA to always-on, but this is an open-source system; I'm sure someone will be able to hack the GPU driver.
For games that really push the GPU, try Gameloft's Modern Combat 2: Black Pegasus, or even N.O.V.A. Black Pegasus really blows me away.
As for the iPhone 4, it's not that great actually; even Angry Birds looks jaggier than on my DHD. At least that's what I see comparing against my friend's iPhone 4, and he's even amazed by the graphics quality on my DHD.
LoneThread said:
I keep hearing that the Desire HD has a very good ATi GPU but so far all the 3D Games i've played do not look very good and have no Anti-Aliasing. I saw a 3D game on the iPhone 4 that was using the Unreal Engine and it looked amazing for a phone.
So are there any great games that look really good and show off the Desire HD GPU to its full potential?
Also I'm wondering if its possible to force Anti-Aliasing on for games that do not have it enabled? (I.e. Need for Speed: Shift)
I don't know who told you the DHD had an ATI GPU, but you're mistaken. The DHD doesn't even have dedicated graphics (unlike the iPhone and Galaxy S); it uses the Qualcomm Adreno 205 GPU that is part of the Snapdragon chipset. Adreno does support AA, and you can use it in apps like PSX4Droid v2. Also try Dungeon Defenders; it's another game that uses the Unreal Engine, and it definitely shows off Adreno to the max.
http://www.youtube.com/watch?v=tM_3QG4U63I
Adreno was built by AMD/ATI and is indeed part of Snapdragon, but the GPU and CPU are still separate. So Adreno is a dedicated GPU.
Sent from my Desire HD using XDA App
****head said:
Adreno is built by AMD/ATI and is indeed a part of snapdragon, but GPU and CPU are still seperated. So adreno is a dedicated gpu.
Sent from my Desire HD using XDA App
^^^
/ / /
He is right!
Although, in fact, Adreno was ATI's until Qualcomm bought that department!
tolis626 said:
^^^
/ / /
He is right!
Although in fact,Adreno was ATI until Qualcomm bought that department!
I didn't know that part.
But does AMD still provide support for Adreno, or does Qualcomm do that now? If Qualcomm does, then it's better to call Adreno a Qualcomm chip instead of an AMD one.
Sent from my Desire HD using XDA App
tolis626 said:
^^^
/ / /
He is right!
Although in fact,Adreno was ATI until Qualcomm bought that department!
****head said:
I didn't know that part
But amd still provides support to adreno or does qualcomm do that now? If Q does, than it's better to call adreno a Qnchip instead of amd
Sent from my Desire HD using XDA App
Regardless of who started the Adreno project, it is now owned and made by Qualcomm. It was Nintendo who first started the "PlayStation" project, with Sony only as advisers, but it's not the Nintendo PlayStation now, is it? Anyway, as for dedicated graphics, the term "dedicated" refers to the GPU having dedicated VRAM; Adreno uses shared VRAM and is therefore not a dedicated graphics processing unit.
****head said:
I didn't know that part
But amd still provides support to adreno or does qualcomm do that now? If Q does, than it's better to call adreno a Qnchip instead of amd
Sent from my Desire HD using XDA App
Well, I don't know about that! I suppose yes; Adreno (formerly Imageon) was part of AMD's graphics department. I have no idea whether that was before or after AMD bought ATI, though...

[Q] FPU/VFP coprocessor for Desire HD?

I'm wondering whether the MSM8255 Snapdragon has any hardware for floating-point calculation. Also, is there a datasheet/manual for the Scorpion CPU core? I can't find anything while googling...
tiisch said:
Im wondering whether the MSM8255 Snapdragon utilizes any hardware for floating point calculation? And also, is there a datasheet/manual for the Scorpion core CPU? I cant find anything while googling...
http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.ddi0344k/Chdjbice.html
Hm, thanks for your post, but that link points to the Cortex-A8, while the MSM8255's Scorpion is Qualcomm's own ARMv7 core.
Datasheet, anyone?
BR
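Even without a public Scorpion datasheet, you can confirm FP hardware from the device itself: the `Features` line of `/proc/cpuinfo` on ARM lists flags like `vfpv3` and `neon` when the kernel detects them, and Scorpion-based devices are generally reported with both. A small parsing sketch; the sample string is illustrative, not captured from a real MSM8255:

```python
def has_hw_float(cpuinfo_text: str) -> dict:
    """Check the ARM 'Features' line(s) of /proc/cpuinfo for FP hardware."""
    features = set()
    for line in cpuinfo_text.splitlines():
        if line.lower().startswith("features"):
            features.update(line.split(":", 1)[1].split())
    return {
        "vfp": any(f.startswith("vfp") for f in features),  # any VFP variant
        "neon": "neon" in features,                          # SIMD unit
    }

# Illustrative cpuinfo excerpt, not a capture from a real MSM8255:
sample = ("Processor : ARMv7 Processor rev 2 (v7l)\n"
          "Features : swp half thumb fastmult vfp edsp neon vfpv3")
print(has_hw_float(sample))  # {'vfp': True, 'neon': True}
```

On a rooted or adb-enabled device, `adb shell cat /proc/cpuinfo` gives you the real input for this.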

[Q] Tegra 3 vs Mali-400?

Hi
Which is better, Tegra 3 or Mali-400?
I don't know, mate; this is what my phone is capable of now, after the update.
Sent from my HTC One X using xda premium
Well it will be a race for sure
Mali might be faster (or maybe not), but Tegra 3 is definitely better, because it has better, enhanced games. Developers develop for Tegra; they don't develop for Mali or Adreno.
One guy complained that Shadowgun looks better on my phone than on his iPad 3. I had to explain that I'm running the THD version, that we have those Tegra-enhanced games. That makes a difference.
Tegra 3 will run all games; Adreno/Mali will require Chainfire3D with plugins to run Tegra games.
That's my view on it.
The Mali 400 is old now; surely it's not what the SGS3 is getting.
John.
Even if the SGS3 gets the Mali T-604, I will stick with Tegra 3 for now, unless I see games dedicated to the T-604, and more than just one.
http://www.engadget.com/2012/04/20/galaxy-s-iii-leak/
According to this, it will have the 400.
antipesto93 said:
http://www.engadget.com/2012/04/20/galaxy-s-iii-leak/
according to this it will have the 400
I didn't notice it mention the 400. But if true, people will find it disappointing, even though the 400 is still a serious piece of hardware. Given the 720p screen, performance would be worse compared to the SGS2.
The Mali's performance is the same as the Tegra 3's in graphics benchmarks I've done on my Note vs. my Prime and my One X (which just goes to show how average the Tegra 3 GPU really is, I think: no better than something at least 6 months older). Disappointing that it's not the upgraded GPU, if that's accurate, but it doesn't differentiate the products at all.
Tinderbox (UK) said:
The Mali 400 is old now
Technically, so is Teg3. S4 uses 28nm and the 4212 uses 32nm. Teg3 is two 45nm A9 chips glommed together because Nvidia wanted to be first to market with a next-gen chip. It's the least advanced of any of the three SoCs. From a GPU perspective none of the three really move the ball forward and are just evolutionary vs. revolutionary. If I had to guess best overall performance I’d say 4212, Teg3, and S4 in that order. Because S4 and the 4212 are on smaller dies they’ll be more efficient and handily beat Teg3 at battery life (except maybe at idle).
BarryH_GEG said:
Technically, so is Teg3. S4 uses 28nm and the 4212 uses 32nm. Teg3 is two 45nm A9 chips glommed together because Nvidia wanted to be first to market with a next-gen chip. It's the least advanced of any of the three SoCs. From a GPU perspective none of the three really move the ball forward and are just evolutionary vs. revolutionary. If I had to guess best overall performance I’d say 4212, Teg3, and S4 in that order. Because S4 and the 4212 are on smaller dies they’ll be more efficient and handily beat Teg3 at battery life (except maybe at idle).
Tegra 3 is actually made on 40nm. Nvidia is still on TSMC's 40nm process and is migrating towards 28nm with its desktop GPUs; it will eventually move to 28nm with the Tegra 3+.
I hate how people always say it's a bad thing that Apple didn't upgrade the GPU but just added more cores, or that Samsung didn't change the Mali 400 GPU. The fact is that the Mali and the SGX543MP2 were ahead when they were released. Now there is actual competition, like the Adreno 320 and Tegra 3/4, and a simple overclocked SGX or Mali chip is enough to keep up with the competition.
NZtechfreak said:
The Mali's performance is the same as the Tegra 3's in graphics benchmarks I've done on my Note Vs my Prime and my One X (just goes to show how average the Tegra 3 GPU really is I think, no better than something at least 6 months older). Disappointing it's not the upgraded GPU if that is accurate, but doesn't differentiate the products at all.
The Mali 400/450 is a 2nd-generation GPU like the Tegra 2: only 44 million polygons/sec. My Adreno 205 does 41 million, and the Tegra 3 does 129 million.
Gameloft games at the end of 2012 will need 100 million...
The 3rd-generation Mali is the T-604/640, and ARM says it has 500% the performance of previous Mali GPUs:
http://www.arm.com/products/multimedia/mali-graphics-hardware/mali-t604.php
The 500% is with quad-core-optimised apps (only Tegra 3 will have those in less than 2 years); it's 250% in dual-core...
As the Tegra 3 is roughly equal to the T-604, the Mali 400 is pwned...
- 1st gen (Adreno 200, Mali 200/300, PowerVR SGX 520/530 & Tegra 1)
- 2nd gen (Adreno 205, Mali 400MP/450MP, PowerVR SGX 540/554 & Tegra 2)
- 3rd gen (Adreno 220/225/320, Mali T604/640, PowerVR G6200/G6430 & Tegra 3)
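Taking the polygon-rate figures above at face value (they are the poster's numbers, not verified manufacturer specs), the comparison against the claimed 100 Mpoly/s requirement works out like this:

```python
# The polygon-rate figures quoted above, taken at face value
# (the poster's numbers, not verified manufacturer specs):
rates_mpoly = {"Mali 400": 44, "Adreno 205": 41, "Tegra 3": 129}
NEEDED = 100  # the claimed requirement for late-2012 Gameloft titles

def meets_target(rate_mpoly: int, needed: int = NEEDED) -> bool:
    """True if a GPU's claimed polygon rate covers the claimed need."""
    return rate_mpoly >= needed

for gpu, rate in sorted(rates_mpoly.items(), key=lambda kv: -kv[1]):
    verdict = "enough" if meets_target(rate) else "short"
    print(f"{gpu}: {rate} Mpoly/s, {rate / NEEDED:.2f}x of target, {verdict}")
```

By these numbers only the Tegra 3 clears the bar, with the Mali 400 and Adreno 205 both at under half the claimed requirement.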
Sekhen said:
Mali 400/450 is a 2nd generation GPU like tegra 2, only 44 millions polygons/sec, My Adreno 205 is 41 millions & The Tegra 3 is 129 millions.
Gameloft games in the end of 2012 will need 100 millions...
The Mali 3rd generation is Mali T-604/640 & Mali say that's it is 500% the performances of previous Mali GPU's :
http://www.arm.com/products/multimedia/mali-graphics-hardware/mali-t604.php
500% is using quad-core optimised applis (only tegra 3 will have it in less than 2 years) but it's 250% in dual-core...
As Tegra 3 is equal to T-604, Mali 400 is pawned...
-1st gen (Adreno 200, mali 200/300, SGX Power VR 520/530 & tegra 1)
-2nd gen (Adreno 205, Mali 400MP/450MP, SGX Power VR 540/554 & tegra 2)
-3rd gen (Adreno 220/225/320, Mali T604/640, SGX Power VR G 6200/6430 & Tegra 3)
link to your numbers about Tegra 3?
I have not used any device with Mali 400. Sorry mate~~
I think Tegra 3 is better, but we have to wait for the 3.x kernel to solve the battery problem properly.
Sent from my HTC One X using xda app-developers app
The Mali-400 is good and strong, and while the Tegra 3 might not be the fastest one there is, it's the only one that gets the best-looking games. On top of that, the Tegra 3 Plus is coming soon, and then next year another one with DirectX and supposed console-like performance. See what Nvidia does for desktops and just hope they keep the same pace with mobile GPUs, and we will get there too. I don't really consider a non-Tegra device unless it amazes me with noticeably better power efficiency, or optimized games start coming out for it.
Would you buy a non-Nvidia, non-ATI graphics card for your PC?
schriss said:
Mali-400 is good and strong and Tegra 3 might not be the fastest one there is, but it's the only one that gets best looking games. On top of that, Tegra 3 Plus is coming soon and then next year another one with direct x and supposed console-like performance. See what Nvidia does for desktops and just hope they keep the pace with mobile GPU and we will get there too. I don't really consider non-tegra device unless it amazes me with noticeably better power efficiency or optimized games start coming out for it.
Would you buy non-nvidia and non-ati graphics card for your pc?
Exactly!
Given the choice, I would buy a Tegra device over anything else.

[Q] Help Overclocking SM-P600 ARM A15 cores: is it possible?

I've got apps that can overclock the cores, but only the 4 small cores, which clock at 1.3 GHz and can be overclocked to 1.9 GHz. I don't need that, since the Exynos 5 Octa has 4 other cores that already clock at 1.9 GHz, and the Dolphin emulator doesn't use those cores. When I overclock the 1.3 GHz cores to 1.9 GHz while playing Super Smash Bros. Melee, it runs just as well as on the Note 3, which has a 2.3 GHz Snapdragon. If I could use and overclock the 1.9 GHz cores, I might be able to run that and other graphics-intensive games almost perfectly. Help please.
NerroEx said:
Ive gotten apps that can over clock the cores, but it's only the 4 SMALL cores which clock at 1.3 GHZ and can over clock into 1.9 GHZ but I don't need that since the Exynos 5 Octa has 4 OTHER cores that clock at 1.9GHZ. The Dolphin emulator doesn't use the 1.9GHZ cores. When I over clock there 1.3GHZ to 1.9GHZ while playing Super Smash Bros Melee, it runs just as good as the Note 3 does, which is 2.3GHZ SnapDragon. If I were to use AND over clock the 1.9GHZ cores, I may be able to run that and also other graphic intense games almost perfectly. Help please.
Overclocking was attempted with the Bindroid kernel, but it would not stick. As for it running better on the Note 3: you are pushing a lot fewer pixels on the Note 3 (1920x1080) than on the Note 2014 (2560x1600).
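The pixel-load point is easy to sanity-check with the standard panel resolutions (1920x1080 for the Note 3, 2560x1600 for the Note 10.1 2014):

```python
def pixels(width: int, height: int) -> int:
    """Total pixels the GPU must fill per frame at native resolution."""
    return width * height

note3 = pixels(1920, 1080)      # the Note 3's 1080p panel
note_2014 = pixels(2560, 1600)  # the Note 10.1 2014's WQXGA panel

ratio = note_2014 / note3
print(f"The Note 10.1 2014 pushes {ratio:.2f}x the pixels of the Note 3")
```

The ratio comes out just under 2x, so equal frame rates mean the tablet GPU is doing roughly twice the fill work.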
nrage23 said:
Overclocking was attempted with the Bindroid Kernel but it would not stick. As for it running better on the Note 3, you are pushing a lot fewer pixels on the Note 3 (1200x1980) versus Note 2014 (2560x1600)
What's a Bindroid kernel, lol? And shouldn't that mean it would be harder to run on the tablet? Also, if I could overclock the bigger cores and the GPU (so far the apps that overclock GPUs are only for Snapdragon), then it would run better than any device on the market.
Also, I found this: "The core switching is controlled by a firmware layer that sits in between the software and the chip itself. Operating systems can be tweaked to better support big.LITTLE's particular arrangement of cores, but any OS that supports power state switching for CPUs (any mainstream operating system from the last decade or so) can take advantage of big.LITTLE without any additional changes."
Source: http://www.wired.co.uk/news/archive/2013-03/19/exynos-5-octa
Though this is about the Galaxy S4, it should be the same architecture as the Note 10.1 2014. But is it possible to tweak that firmware, or even manually activate the 4 stronger cores at will?
NerroEx said:
what's a bindroid kernel lol and well then shouldn't that mean that it should be harder to run on the tablet??? and also if I were to over clock the bigger cores and over clock the gpu, which so far the apps that over clock gpu's are only for Snapdargon, then it would run better than any device on the market.
Also, I found this: "The core switching is controlled by a firmware layer that sits in between the software and the chip itself. Operating systems can be tweaked to better support big.LITTLE's particular arrangement of cores, but any OS that supports power state switching for CPUs (any mainstream operating system from the last decade or so) can take advantage of big.LITTLE without any additional changes."
Source:http://www.wired.co.uk/news/archive/2013-03/19/exynos-5-octa
Though this is for the galaxy s4, it should be the architecture as the note 10.1 2014. But is it possible to tweak that firmware or even MANUALLY activate the 4 stronger cores at will???
You may want to do some more research before you do anything. Most aspects of the CPU and GPU are handled by the kernel. The only way to overclock any device is with a custom kernel that has the higher frequencies in its tables. DutchDanny tried to get overclocking working, but it did not work. You can, however, underclock almost any device, since all the lower frequencies are listed. If we wanted to enable big.LITTLE on the Note 2014, we would need Samsung to release source code enabling it, which I am sure they will not do. Again, big.LITTLE is kernel-dependent; the Android OS is not the problem, Samsung is. Do a search; there is a very good thread about the 8-core thing.
nrage23 said:
You may want to do some more research before you do anything. Most aspects of the CPU or GPU are handled by the kernel. The only way you can overclock any device is with a custom kernel that has the higher frequencies in the tables. DutchDanny tried to get overclocking working but it did not work. You can however underclock most any device since it has all the lower frequencies listed. If we wanted to enable Big Little on the Note 2014 we would need Samsung to release source code enabling it. Which I am sure they will not do. Again the Big Little is kernel dependent. The Android OS is not the problem Samsung is the problem. Do a search there is a very good thread about the 8 core thing.
Wait, wuuut? I heard there was going to be an update for the Exynos 5 Octa enabling the big.LITTLE architecture?
NerroEx said:
Wait wuuut I heard there was going to be an update for the Exynos 5 octa for Big.Little Architecture ???
The Exynos 5 is fully capable but Samsung has never said they would update the Note 2014 to enable it.
Sent from my HTC6600LVW using XDA Premium HD app
nrage23 said:
The Exynos 5 is fully capable but Samsung has never said they would update the Note 2014 to enable it.
Sent from my HTC6600LVW using XDA Premium HD app
Actually, I heard they were going to release an update in either Q3 or Q4.
NerroEx said:
Actually I heard either in Q4 or Q3 they were going to release an update
I got the Note 2014 the day it came out, and I check all kinds of news and forums every day. I have seen nothing where they stated they would update any current devices with big.LITTLE. They have stated that by the end of the year they would release new devices with big.LITTLE implemented; I would guess the first ones will be the 6-core midrange processors, due to heat and power usage. You can get a lot more information from this thread in the main section.
http://forum.xda-developers.com/showthread.php?t=2645875
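The "frequencies must be in the kernel's tables" point above can be illustrated by how Linux cpufreq exposes those tables through sysfs. A hypothetical sketch: the path in the docstring is the standard cpufreq location, but the frequency values below are made up for illustration, not real Exynos entries:

```python
def max_settable_khz(available_freqs: str, requested_khz: int) -> int:
    """Clamp a requested frequency to the kernel's cpufreq table.

    `available_freqs` mimics the contents of
    /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_frequencies.
    Anything above the table's top entry simply cannot be set, which is
    why overclocking needs a custom kernel with extra table entries.
    """
    table = sorted(int(f) for f in available_freqs.split())
    candidates = [f for f in table if f <= requested_khz]
    return candidates[-1] if candidates else table[0]

# Illustrative table for a 1.3 GHz little cluster (made-up values):
table = "250000 600000 800000 1000000 1200000 1300000"
print(max_settable_khz(table, 1900000))  # -> 1300000: 1.9 GHz isn't in the table
```

This is why an overclocking app can only offer what the running kernel already lists; without a custom kernel, the 1.9 GHz request silently falls back to the table's ceiling.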

[Q] Is the MediaTek MT6795T in M9+ Better Than SD 810?

What about PowerVR 6200 GPU vs Adreno 430?
A good question!
What's actually inside these processors...
I can't post an outside link... It's eight A53 cores clocked at 2.0 GHz with a PowerVR 6200.
The 810 is a (4) A53 / (4) A57 big.LITTLE combo.
Long story short, Geekbench says the MediaTek barely wins in multi-core but is smashed in single-core, because it's a true octa-core, but just a midrange one severely overclocked, with last generation's GPU running the blinky flashy show.
This is all based on the MT6795; not sure what the (T) means...
atomikpunx said:
What's actually inside these processors...
Can't post outside link... Its (8) a53 processors clocked at 2.0ghz w/ powervr 6200
The 810 being (4)a53/(4)a57 big.LITTLE combo.
Long story short geekbench says the mediatek wins in multicore barely, but is smashed in single core, because it's a true octacore, but just a midrange one severely overclocked, with last generations gpu running the blinky flashy show.
this is all based on mt6795 not sure what the (t) means...
So, what does this mean in real world use?
Sharpshooterrr said:
So, what does this mean in real world use?
The latest MediaTek processors truly provide smoothness, but the MT6795T is just a flashy overclocked MT6795. Yes, it beats the SD810; even the Snapdragon 805 beats the 810 in some devices, maybe because the 810 throttles itself into oblivion.
MediaTek SoCs are known to have poor embedded security.
Additionally, if you think about the way the big.LITTLE architecture works, it makes a lot more logical sense than 8 identical cores in a true octa-core setup.
The M9+ is out, so we'll see.
