So with the new Dual Core phones coming out I'm wondering... What's all the hullabaloo?
I just finished reading the Moto Atrix review from Engadget and it sounds like crap. They said docking to the ridiculously priced webtop accessory was slow as shiz.
Anyone who knows better, please educate me. I'd like to know what is or will be offered that Dual Core will be capable of that our current gen phones will NOT be capable of.
For one thing (my main interest anyway), dual core CPUs and beyond give us better battery life. If we end up having more data intensive apps and Android becomes more powerful, multi-core CPUs will help a lot also. Naturally, Android will need to be broken down and revamped to utilize multiple cores to their full potential, though. At some point I can see Google using more of, or merging a large part of, the desktop Linux kernel to help with that process.
At the rate Android (and smartphones in general) is progressing, someday we may see a 64-bit OS on a phone; we will definitely need multi-core CPUs then. I know, it's a bit of a dream, but it's probably not too far-fetched.
KCRic said:
For one thing (my main interest anyway), dual core CPUs and beyond give us better battery life.
Click to expand...
Click to collapse
I'd really, REALLY like to know how you came to that particular conclusion. While a dual core might not eat through quite as much wattage as two single cores, one that takes less is pure snakeoil IMO. I have yet to see a dual core CPU that is rated lower than a comparable single core on the desktop. Why would this be different for phones?
Software and OSes that can handle a dual core CPU need additional CPU cycles to manage the threading this results in, so if anything, dual core CPUs will greatly, GREATLY diminish battery life.
The original poster's question is valid. What the heck would one need dual core CPUs in phones for? Personally, I can't think of anything. Running several apps in parallel was a piece of cake way before dual-core CPUs, and more power can easily be obtained by increasing the clock speed.
I'm not saying my parent poster is wrong, but I sure as heck can't imagine the physics behind his statement. So if I'm wrong, someone please enlighten me.
I can see dual cores offering a smoother user experience -- one core could be handling an audio stream while the other is doing phone crap. I don't see how it could improve battery life though....
The theory is that two cores can accomplish the same thing as a single core while only working half as hard. I've seen several articles stating that dual cores will help battery life; whether that is true I don't know.
Sent from my T-Mobile G2 using XDA App
Kokuyo, while you do have a point about dual cores being overkill in a phone, I remember long ago people saying "why would you ever need 2GB of RAM in a PC" or "who could ever fill up a 1TB hard drive."
Thing is, wouldn't the apps themselves have to be made to take advantage of dual cores as well?
JBunch1228: The short-term answer is nothing. Same answer as the average Joe asking what he needs a quad-core in his desktop for. Right now it seems as much a sales gimmick as anything else, since the only Android version that can actually make use of it is HC. Kinda like the 4G bandwagon everyone jumped on; all marketing right now.
Personally, I'd like to see what happens with the paradigm the Atrix is bringing out in a year or so. Put Linux on a decent sized SSD for the laptop component, and use the handset for processing and communications exclusively, rather than trying to use the 'laptop dock' as nothing more than an external keyboard.
As far as battery life, I can see how dual-cores could affect it positively, as a dual core doesn't pull as much power as two individual cores, and, if the chip is running for half as long as a single core would for the same operation, that would give you better battery life. Everyone keep in mind I said *if*. I don't see that happening before Q4, since the OS and apps need to be optimized for it.
My $.02 before depreciation.
Then there are the rumors of mobile quad-cores from Nvidia by Q4 as well. I'll keep my single core Vision and see what's out there when my contract ends. We may have a whole new world.
KCRic said:
For one thing (my main interest anyway), dual core CPUs and beyond give us better battery life. If we end up having more data intensive apps and Android becomes more powerful, multi-core CPUs will help a lot also. Naturally, Android will need to be broken down and revamped to utilize multiple cores to their full potential, though. At some point I can see Google using more of, or merging a large part of, the desktop Linux kernel to help with that process.
Click to expand...
Click to collapse
Wow, that's complete nonsense.
You can't add parts and end up using less power.
Also, Android needs no additional work to support multiple cores. Android runs on the LINUX KERNEL, which is ***THE*** choice for multi-core/multi-processor supercomputers. Android applications run each in their own process, the linux kernel then takes over process swapping. Android applications also are *already* multi-threaded (unless the specific application developer was a total newb).
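To illustrate the scheduling point: in the sketch below (plain Python on any Linux machine, not Android code), each worker runs in its own OS process and the kernel alone decides which core executes it, just as it does for Android app processes.

```python
import multiprocessing as mp

def work(n):
    # Some CPU-bound task; the kernel schedules each worker process onto
    # whatever core is free -- the program never manages cores itself.
    return sum(range(n))

if __name__ == "__main__":
    # Like Android apps, each worker below is a separate OS process.
    with mp.Pool(processes=4) as pool:
        print(pool.map(work, [10, 100, 1000]))  # [45, 4950, 499500]
```

The same monolithic kernel doing this on a phone is the one doing it on multi-processor servers; only the process count differs.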
At the rate Android (and smartphones in general) is progressing, someday we may see a 64-bit OS on a phone; we will definitely need multi-core CPUs then. I know, it's a bit of a dream, but it's probably not too far-fetched.
Click to expand...
Click to collapse
What's the connection? Just because the desktop processor manufacturers went multi-core and 64bit at roughly the same time doesn't mean that the two are even *slightly* related. Use of a 64bit OS on a phone certainly does ***NOT*** somehow require that the processor be multi-core.
dhkr234 said:
Wow, that's complete nonsense.
You can't add parts and end up using less power.
Also, Android needs no additional work to support multiple cores. Android runs on the LINUX KERNEL, which is ***THE*** choice for multi-core/multi-processor supercomputers. Android applications run each in their own process, the linux kernel then takes over process swapping. Android applications also are *already* multi-threaded (unless the specific application developer was a total newb).
What's the connection? Just because the desktop processor manufacturers went multi-core and 64bit at roughly the same time doesn't mean that the two are even *slightly* related. Use of a 64bit OS on a phone certainly does ***NOT*** somehow require that the processor be multi-core.
Click to expand...
Click to collapse
The connection lies in the fact that this is technology we're talking about. It continually advances and does so at a rapid rate. Nowhere did I say we'll make that jump 'at the same time'. Linux is not ***THE*** choice for multi-core computers; I use Sabayon, but Win7 also seems to do just fine with multiple cores. Android doesn't utilize multi-core processors to their full potential and also uses a modified version of the Linux kernel (which does fully support multi-core systems); that's why I made the statement about merging. Being Linux and being based on Linux are not the same thing. Think of iOS or OSX - based on Linux, but tell me, how often do Linux instructions work for a Mac?
"you can't add parts and use less power", the car industry would like you clarify that, along with the computer industry. 10 years ago how much energy did electronics use? Was the speed and power vs. power consumption ratio better than it is today? No? I'll try to give an example that hopefully explains why consumes less power.
Pizza=data
People=processors
Time=heat and power consumption
1 person takes 20 minutes to eat 1 whole pizza while 4 people take only 5 minutes. That one person is going to have to work harder and longer in order to complete the same task as the 4 people. That will use more energy and generate much more heat. Heat, as we know, causes processors to become less efficient which means more energy is wasted at the higher clock cycles and less information processed per cycle.
It's not a very technical explanation of why a true multi-core system uses less power but it will have to do. Maybe ask NVidia too since they stated the Tegra processors are more power efficient.
KCRic said:
The connection lies in the fact that this is technology we're talking about. It continually advances and does so at a rapid rate. Nowhere did I say we'll make that jump 'at the same time'. Linux is not ***THE*** choice for multi-core computers; I use Sabayon, but Win7 also seems to do just fine with multiple cores.
Click to expand...
Click to collapse
Show me ***ONE*** supercomputer that runs wondoze. I DARE YOU! They don't exist!
Android doesn't utilize multi-core processors to their full potential and also uses a modified version of the Linux kernel (which does fully support multi-core systems); that's why I made the statement about merging. Being Linux and being based on Linux are not the same thing.
Click to expand...
Click to collapse
??? No, being LINUX and GNU/LINUX are not the same. ANDROID ***IS*** LINUX, but not GNU/LINUX. The kernel is the kernel. The modifications? Have nothing to do with ANYTHING this thread touches on. The kernel is FAR too complex for Android to have caused any drastic changes.
Think of iOS or OSX - based on Linux, but tell me, how often do Linux instructions work for a Mac?
Click to expand...
Click to collapse
No. Fruitcakes does NOT use LINUX ***AT ALL***. They use MACH. A *TOTALLY DIFFERENT* kernel.
"you can't add parts and use less power", the car industry would like you clarify that, along with the computer industry. 10 years ago how much energy did electronics use? Was the speed and power vs. power consumption ratio better than it is today? No? I'll try to give an example that hopefully explains why consumes less power.
Click to expand...
Click to collapse
Those changes are NOT RELATED to adding cores, but making transistors SMALLER.
Pizza=data
People=processors
Time=heat and power consumption
1 person takes 20 minutes to eat 1 whole pizza while 4 people take only 5 minutes. That one person is going to have to work harder and longer in order to complete the same task as the 4 people. That will use more energy and generate much more heat. Heat, as we know, causes processors to become less efficient which means more energy is wasted at the higher clock cycles and less information processed per cycle.
It's not a very technical explanation of why a true multi-core system uses less power but it will have to do. Maybe ask NVidia too since they stated the Tegra processors are more power efficient.
Click to expand...
Click to collapse
You have come up with a whole lot of nonsense that has ABSOLUTELY NO relation to multiple cores.
Energy consumption is related to CPU TIME.
You take a program that takes 10 minutes of CPU time to execute on a single-core 3GHz processor, split it between TWO otherwise identical cores operating at the SAME FREQUENCY, add in some overhead to split it between two cores, and you have 6 minutes of CPU time on TWO cores, which is 20% *MORE* energy consumed on a dual-core processor.
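That back-of-the-envelope argument can be checked directly. A minimal sketch using the numbers from the post (10 minutes on one core vs. 6 minutes each on two cores, overhead already folded in):

```python
# Energy scales with total CPU time (core-minutes) when the cores are
# identical and run at the same frequency and voltage.
single_core_minutes = 10 * 1   # 10 minutes on one core
dual_core_minutes = 6 * 2      # 6 minutes on each of two cores (with overhead)

ratio = dual_core_minutes / single_core_minutes
print(ratio)  # 1.2 -> the 20% extra energy claimed in the post
```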
And you want to know what NVIDIA will say about their bloatchips? It uses less power than *THEIR* older hardware because it has **SMALLER TRANSISTORS** that require less energy.
Don't quit your day job; computer engineering is NOT YOUR FORTE.
dhkr234 said:
Show me ***ONE*** supercomputer that runs wondoze. I DARE YOU! They don't exist!
??? No, being LINUX and GNU/LINUX are not the same. ANDROID ***IS*** LINUX, but not GNU/LINUX. The kernel is the kernel. The modifications? Have nothing to do with ANYTHING this thread touches on. The kernel is FAR too complex for Android to have caused any drastic changes.
No. Fruitcakes does NOT use LINUX ***AT ALL***. They use MACH. A *TOTALLY DIFFERENT* kernel.
Those changes are NOT RELATED to adding cores, but making transistors SMALLER.
You have come up with a whole lot of nonsense that has ABSOLUTELY NO relation to multiple cores.
Energy consumption is related to CPU TIME.
You take a program that takes 10 minutes of CPU time to execute on a single-core 3GHz processor, split it between TWO otherwise identical cores operating at the SAME FREQUENCY, add in some overhead to split it between two cores, and you have 6 minutes of CPU time on TWO cores, which is 20% *MORE* energy consumed on a dual-core processor.
And you want to know what NVIDIA will say about their bloatchips? It uses less power than *THEIR* older hardware because it has **SMALLER TRANSISTORS** that require less energy.
Don't quit your day job; computer engineering is NOT YOUR FORTE.
Click to expand...
Click to collapse
If you think that it's just a gimmick or trend, then why does every laptop manufacturer use dual core or more and get better battery life than the old single cores? Sometimes trends do have more use than aesthetic appeal. Your know-it-all approach is nothing new around here, and you're not the only person who works in IT around here. Theories are one thing, but without any proof, when ALL current tech says otherwise... it makes you sound like an idiot. Sorry...
I bet I can pee further
Sent from my HTC Vision using XDA App
zaelia said:
I bet I can pee further
Sent from my HTC Vision using XDA App
Click to expand...
Click to collapse
The smaller ones usually can, I think it has to do with the urethra being more narrow as to allow a tighter, further shooting stream.
Sent from my HTC Glacier using XDA App
TJBunch1228 said:
The smaller ones usually can, I think it has to do with the urethra being more narrow as to allow a tighter, further shooting stream.
Sent from my HTC Glacier using XDA App
Click to expand...
Click to collapse
Well, you would know
sino8r said:
Well, you would know
Click to expand...
Click to collapse
It might be short but it sure is skinny.
Sent from my HTC Glacier using XDA App
sino8r said:
If you think that it's just a gimmick or trend, then why does every laptop manufacturer use dual core or more and get better battery life than the old single cores? Sometimes trends do have more use than aesthetic appeal. Your know-it-all approach is nothing new around here, and you're not the only person who works in IT around here. Theories are one thing, but without any proof, when ALL current tech says otherwise... it makes you sound like an idiot. Sorry...
Click to expand...
Click to collapse
+1
I was comparing speeds on the Atrix compared to the [email protected] and they matched. The Atrix was much more efficient on heat and probably with battery. The dual cores will use less power because the tasks will be split between them, each core running at a lower load than a single core handling the same process on its own. Let's not start a flame war and make personal attacks on people.
Sent from my HTC Vision with Habanero FAST 1.1.0
It is disturbing that there are people out there who can't understand this VERY BASIC engineering.
Voltage, by itself, has NO MEANING. You are forgetting about CURRENT. POWER = CURRENT x VOLTAGE.
Battery drain is DIRECTLY PROPORTIONAL to POWER. Not voltage. Double the voltage and half the current, power remains the same.
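The P = I x V point can be spelled out with a trivial sketch (the values are arbitrary illustrations, not phone measurements):

```python
def power_watts(current_amps, voltage_volts):
    # P = I * V: battery drain tracks power, not voltage alone.
    return current_amps * voltage_volts

base = power_watts(2.0, 1.0)       # 2 A at 1 V -> 2 W
doubled_v = power_watts(1.0, 2.0)  # double the voltage, half the current
print(base == doubled_v)  # True: power (and battery drain) is unchanged
```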
Dual core does NOT increase battery life. It increases PERFORMANCE by ***DOUBLING*** the physical processing units.
Battery life is increased through MINIATURIZATION and SIMPLIFICATION, which becomes *EXTREMELY* important as you increase the number of physical processing units.
It is the epitome of IGNORANCE to assume that there is some relation when there is not. The use of multiple cores relates to hard physical limitations of the silicon. You can't run the silicon at 18 GHz! Instead of racing for higher frequencies, the new competition is about how much work you can do with the SAME frequency, and the ***EASIEST*** way to do this is to bolt on more cores!
For arguments sake, take a look at a couple of processors;
Athlon II X2 240e / C3.... 45 watt TDP, 45 nm
Athlon II X4 630 / C3.... 95 watt TDP, 45 nm
Same stepping, same frequency (2.8 GHz), same voltage, same size, and the one with twice the cores eats more than twice the power. Wow, imagine that!
The X4 is, of course, FASTER, but not by double.
Now lets look at another pair of processors;
Athlon 64 X2 3800+ / E6.... 89 watt TDP, 90 nm
Athlon II X2 270u / C3.... 25 watt TDP, 45 nm
Different stepping, SAME frequency (2.0 GHz), same number of cores, different voltage, different SIZE, WAY different power consumption. JUST LOOK how much more power the older chip eats!!! 3.56 times as much. Also note that other power management features exist on the C3 that didn't exist on the E6, so the difference in MINIMUM power consumption is much greater.
Conclusion: There is no correlation between a reduction in power consumption and an increase in the number of PPUs. More PPUs = more performance. Reduction in power consumption is related to size, voltage, and other characteristics.
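Plugging the quoted TDP figures into a quick sketch reproduces both comparisons (the numbers are the ones listed above, as given in the post; TDP is a rough proxy for worst-case power, not an exact measurement):

```python
# TDP figures in watts, as quoted in the post.
chips = {
    "Athlon II X2 240e":  {"cores": 2, "tdp": 45},  # C3, 45 nm, 2.8 GHz
    "Athlon II X4 630":   {"cores": 4, "tdp": 95},  # C3, 45 nm, 2.8 GHz
    "Athlon 64 X2 3800+": {"cores": 2, "tdp": 89},  # E6, 90 nm, 2.0 GHz
    "Athlon II X2 270u":  {"cores": 2, "tdp": 25},  # C3, 45 nm, 2.0 GHz
}

# Doubling cores at the same stepping/frequency: more than double the power.
print(chips["Athlon II X4 630"]["tdp"] / chips["Athlon II X2 240e"]["tdp"])   # ~2.11

# Same core count and frequency, smaller process and lower voltage: big drop.
print(chips["Athlon 64 X2 3800+"]["tdp"] / chips["Athlon II X2 270u"]["tdp"]) # 3.56
```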
dhkr234 said:
Don't quit your day job; computer engineering is NOT YOUR FORTE.
Click to expand...
Click to collapse
Good job on being a douche. I didn't insult you in anything I said, and if you disagree with my perspective then state it; otherwise shut up. I didn't tell you English grammar isn't your forte, so maybe you should keep your snide remarks to yourself.
You seem to want to argue over a few technicalities, and I'll admit I don't have a PhD in computer engineering, but then again I doubt you do either. For the average person to begin to understand the inner workings of a computer requires you to set aside the technical details and generalize everything. When they read about a Mac, they will see the word Unix, which also happens to appear in things written about Linux, and would inevitably make a connection about both being based off the same thing (which they are). In that sense, I'm correct - you're wrong. The average person doesn't differentiate between 'is' and 'based off'; most people take them in the same context.
So I may be wrong on some things when you get technical, but when you're talking to the average person that thinks a higher CPU core clock = a better processor, you end up being wrong because they won't give a damn about the FSB or anything else. Also, when you start flaming people and jumping on them over insignificant things, you come off as a complete douche. If I'm wrong on something then tactfully and politely correct me - don't try to act like an excerebrose know-it-all. Let's not even mention completely going off track about Windoze; servers aren't the only things that have multi-core processors.
I'm sure you'll try to multi-quote me with a slew of unintelligent looking, lame comebacks and corrections but in the end you'll just prove my point about the type of person you are. ****The End****
KCRic said:
Good job on being a douche. I didn't insult you in anything I said, and if you disagree with my perspective then state it; otherwise shut up. I didn't tell you English grammar isn't your forte, so maybe you should keep your snide remarks to yourself.
Click to expand...
Click to collapse
Agreeing or disagreeing is pointless when discussing FACTS. Perspective has nothing to do with FACTS. You can think whatever you like, but it doesn't make you right.
You seem to want to argue over a few technicalities, and I'll admit I don't have a PhD in computer engineering, but then again I doubt you do either.
Click to expand...
Click to collapse
Common mistake, assuming that everybody is the same as you. Try not to make that assumption again.
For the average person to begin to understand the inner-workings of a computer requires you to set aside the technical details and generalize everything.
Click to expand...
Click to collapse
Generalizations lead to inaccuracies. You do not teach by generalizing, you teach by starting from the bottom and building a foundation of knowledge. Rene Descartes (aka Renatus Cartesius, as in Cartesian geometric system, as in the father of analytical geometry) said that the foundation of all knowledge is that doubting one's own existence is itself proof that there is someone to doubt it -- "Cogito Ergo Sum" -- "I think therefore I am". Everything must begin with this.
When they read about a Mac, they will see the word Unix, which also happens to appear in things written about Linux, and would inevitably make a connection about both being based off the same thing (which they are). In that sense, I'm correct - you're wrong. The average person doesn't differentiate between 'is' and 'based off'; most people take them in the same context.
Click to expand...
Click to collapse
... and need to be CORRECTED for it. The two kernels (the only components relevant to this discussion) are completely different! MACH is a MICRO kernel, Linux is a MONOLITHIC kernel. Superficial characteristics (which are OUTSIDE of the kernel) be damned, they are NOT the same thing and thinking that they are is invalid. The average person is irrelevant, FACTS are FACTS.
So I may be wrong on some things when you get technical, but when you're talking to the average person that thinks a higher CPU core clock = a better processor, you end up being wrong because they won't give a damn about the FSB or anything else.
Click to expand...
Click to collapse
So are you trying to tell me that IGNORANCE is BLISS? Because "giving a damn" or not has NO BEARING on reality. The sky is blue. You think that its purple and don't give a damn, does that make it purple? No, it does not.
Also, when you start flaming people and jumping on them over insignificant things, you come off as a complete douche. If I'm wrong on something then tactfully and politely correct me - don't try to act like an excerebrose know-it-all. Let's not even mention completely going off track about Windoze; servers aren't the only things that have multi-core processors.
Click to expand...
Click to collapse
Right, servers AREN'T the only thing running multi-core processors, but did you not read where I SPECIFICALLY said **SERVERS**? Wondoze is off track and UNRELATED. I brought up servers because THEY USE THE SAME KERNEL AS ANDROID. If a supercomputer uses Linux, do you not agree that Linux is CLEARLY capable of multiprocessing well enough to meet the needs of a simple phone?
I'm sure you'll try to multi-quote me with a slew of unintelligent looking, lame comebacks and corrections but in the end you'll just prove my point about the type of person you are. ****The End****
Click to expand...
Click to collapse
... perfectionist, intelligent, PATIENT in dealing with ignorance. And understand that ignorance is not an insult when it is true, and contrary to common "belief", does NOT mean stupid. Learn the facts and you will cease to be ignorant of them.
So hopefully this train can be put back on the tracks...
From what I am understanding from more technically minded individuals, dual core should help with battery life because it requires less power to run the same things as a single core. It can then probably be extrapolated that, when pushed, dual core will be able to go well above and beyond its single core brethren in terms of processing power.
For now, it appears the only obvious benefit will be increased battery life and less drain on the processor due to overworking. Hopefully in the near future more CPU and GPU intensive processes are introduced to the market which will fully utilize the dual core's potential in the smartphone world. Thanks for all the insight.
dhkr234 - *slaps air high-five*
Can anyone develop an app which can set the CPU speed on WP7 phones?
probably.
Now, this would be kind of cool!! Especially if I can run my Samsung Focus at 1.4GHz.
Overclocking by 40% would probably destroy your phone... just saying.
GoodDayToDie said:
Overclocking by 40% would probably destroy your phone... just saying.
Click to expand...
Click to collapse
I don't think so. My old Omnia HD (Symbian) can be overclocked from 600MHz to 900/950MHz (50% more) without problems, on an older architecture...
I wish to see how much the 2nd gen devices can be overclocked... 1.8GHz would be great! But WP is the fastest OS yet =D
Older architectures are actually often better for overclocking, because they're not running as close to the theoretical limit on the speed of the chip (the practical limit is based on heat dissipation capability, but there are other limits that are more subtle and tend to just result in weird hardware errors rather than thermal shutoff). That said, 1GHz is still probably pretty far from the limit.
I wouldn't OC my device just to watch the CPU speed go up, or maybe just a little bit.
I think this will be useful only for devices with the new 1GHz CPU, like the Radar.
Hello,
I read lots of articles about how the Tegra 3 only scores well in benchmarks because of its 4 cores, which are overkill in almost all real world scenarios. So I was interested to find out how the Tegra 3 would do if you made it a dual core, like the S4.
I ran Antutu three times, running stock everything (except root), and the lowest I got was 7114.
I know it is not very reliable to use one benchmark, but in my opinion, neither is using Quadrant, which is made by Qualcomm.
I find it interesting to see that the Tegra 3 scores considerably more than the S4, even with the same number of cores.
What are your thoughts? What do you think caused this? What does it mean?
Unable to upload screenshot:
Ram - 1202
CPU Integer - 2004
CPU Float - 1550
2D - 295
3D - 1242
Database - 475
SD Read - 150
SD Write - 196
Not too sure why it would still win hands down *shrugs*
How is the real world speed with 2 cores disabled, though?
And does it seem to save any battery if you've had it going for a while?
I've had it set to two cores only for a few days. It makes NO difference to anything but benchmark scores (even antutu still shows 60 FPS in the graphics tests). Games like Dark Meadow THD run exactly the same as before.
I'm not too sure how it has affected the battery life as I installed a mod that lowers the auto brightness at the same time. All I can say is the combination of the two has dramatically increased the life of the battery
So the GPU is the bottleneck (surprise)
Which GPU are you referring to?
Maybe the benchmark tests are yet to be fully optimised for 4 cores?
ORStoner said:
Hello,
I read lots of articles about how the Tegra 3 only scores well in benchmarks because of its 4 cores, which are overkill in almost all real world scenarios. So I was interested to find out how the Tegra 3 would do if you made it a dual core, like the S4.
I ran Antutu three times, running stock everything (except root), and the lowest I got was 7114.
I know it is not very reliable to use one benchmark, but in my opinion, neither is using quadrant which is made by Qualcomm.
I find it interesting to see that the Tegra 3 scores considerably more than the S4, even with the same number of cores.
What are your thoughts? What do you think caused this? What does it mean?
Unable to upload screenshot:
Ram - 1202
CPU Integer - 2004
CPU Float - 1550
2D - 295
3D - 1242
Database - 475
SD Read - 150
SD Write - 196
Click to expand...
Click to collapse
My guess is the reason the Tegra 3 with 2 cores running scores lower than the Snapdragon S4 is because the Tegra 3 has 4 A9 cores, whereas the Snapdragon has 2 cores that are closer to the A15 architecture, which is a faster chip. A15 will be quicker than A9 if the same number of cores are being used in each chipset.
My One X scores around 11000 with four cores and, as you can see, 7114 with two cores. I'm just curious to know why, with two cores, it scores around 1000 more than the S4 version.
thegregulator said:
My guess is the reason the Tegra 3 with 2 cores running scores lower than the Snapdragon S4 is because the Tegra 3 has 4 A9 cores, whereas the Snapdragon has 2 cores that are closer to the A15 architecture, which is a faster chip. A15 will be quicker than A9 if the same number of cores are being used in each chipset.
Click to expand...
Click to collapse
I think you misunderstood; the Tegra with two A9 cores scored higher than the Snapdragon. If you were correct, I would not be as surprised as I am and would not have started this thread.
Out of curiosity, what are you using to lock it to 2 cores?
Wouldn't mind trying it out myself.
Open root explorer.
Go to /sys/kernel/debug/tegra_hotplug.
Open max_cpus in a text editor and change 4 to 2 or 3 (single core does not work).
Open it again to check it has saved properly and it will go back to 4 the next time you reboot the phone.
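Those steps amount to writing a number into a debugfs file; here is a hedged Python sketch (the path is the device-specific one from the post, root is required, and `set_max_cpus` is just an illustrative helper name, not an existing tool):

```python
def set_max_cpus(n, path="/sys/kernel/debug/tegra_hotplug/max_cpus"):
    # Per the post, single core (n=1) does not work; stock value is 4.
    if n not in (2, 3, 4):
        raise ValueError("max_cpus must be 2, 3, or 4")
    with open(path, "w") as f:
        f.write(str(n))
    # Read back to confirm the write stuck (it reverts to 4 on reboot).
    with open(path) as f:
        return f.read().strip()
```

On the phone itself this would be run as root; the same write can of course be done from a root shell instead of Python.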
You know Tegra 3 has 5 cores instead of 4 cores in the A9 architecture right? So you basically did the benchmark with 3 cores instead of 2.
Sent from my Incredible 2 using XDA
The 5th core is just a low clock speed / low power core to run idle tasks?
Doubt it would do much to a benchmark test.
Correct me if I'm wrong, but the companion core can only be used on its own while the main processor is shut off. Even if it could 'assist' the main processor, it's only around 300MHz and would make very little difference to the score of a benchmark.
david_hume said:
You know Tegra 3 has 5 cores instead of 4 cores in the A9 architecture right? So you basically did the benchmark with 3 cores instead of 2.
Sent from my Incredible 2 using XDA
Click to expand...
Click to collapse
From what I know, the 5th companion core is invisible to the system, so 2 would be correct in the max_cpus edit.
I refer to my previous question which was why is the dual core tegra 3 doing BETTER than the dual core S4? Is it down to the GPU?
Well, that's revealing indeed. What's more interesting is how the included governors behave:
ondemand - all four cores, max 1400MHz, conservative about jumping to max
interactive - three cores only, 1400MHz max, jumps to max more often
performance - three cores only, locks at 1200MHz and jumps to 1400MHz under stress
Glowball frame rates suffer badly when running fewer than 4 cores.
You can see that with the default ondemand governor, the Tegra 3 is always juggling 4 cores but rarely peaking at max clocks.
While it indeed sucks as a useful tool, you should be aware that Quadrant is not a Qualcomm program...perhaps you're confusing it with the antiquated NeoCore benchmark. Vellamo is Qualcomm as well.
Sorry, Vellamo is Qualcomm, not Quadrant. My mistake.
ORStoner said:
I refer to my previous question which was why is the dual core tegra 3 doing BETTER than the dual core S4? Is it down to the GPU?
Click to expand...
Click to collapse
If you are referring to the S4 in the HTC One S, then I'd like to know where you got your information indicating that the Tegra is still faster in dual core mode, when even in quad core mode the S4 still outperforms it. The S4 scores over 12000, compared to the 11500 that the Tegra does, and well over the 7000 that it does in dual core mode....
The older A9, however, found in older phones such as the Sensation, is another story, with that chip scoring around 6500.
Anyone got this overclocked?
mox123 said:
anyone got this overclocked
Click to expand...
Click to collapse
And instantly overheated? :cyclops:
Yes.
Sent from my HTC One X using Tapatalk 2
GPU overclock would be more useful than CPU.
Sent from my HTC One X using xda app-developers app
treebill said:
Gpu overclock would be more useful than CPU.
Sent from my HTC One X using xda app-developers app
Click to expand...
Click to collapse
OK, GPU overclock then?
I would overclock my HOX... in a block of ice. Or... well, in real life I don't want to overclock it because it would smoke out in my hand.
Overheating is a big problem even without overclocking; imagine it running at 1.6GHz...
Sent from my Renovated HTC One X using Tapatalk 2
Can't really see why you would want to overclock the One X; the phone is blazing fast anyway, and 4 cores at 1.5GHz is enough..
But like everybody else said, the phone would probably burn up..
I wouldn't overclock my device - at least not at the stage we've reached now.
Why?
a) As long as there is no way to lower the voltage, this might toast your device - it's a unibody, keep that in mind!
b) 100MHz more would have literally no effect - it's a 6GHz device, and even if you boost it up to 6.4 you won't notice; it will just drain your battery.
6GHz is WAY enough... this is a smartphone... I mean... seriously... it's got more power than my 4 year old 1K €uro notebook...
Illux said:
I wouldn't overclock my device - at least not at the stage we've reached now.
Why?
a) As long as there is no way to lower the voltage, this might toast your device - it's a unibody, keep that in mind!
b) 100 MHz more would have literally no effect - it's a 6 GHz device, and even if you boost it up to 6.4 you won't notice; it will just drain your battery.
6 GHz is WAY more than enough... this is a smartphone... I mean... seriously... it's got more power than my 4-year-old 1K € notebook...
Click to expand...
Click to collapse
Well, first of all, you can't just multiply the frequency by the number of cores. I'd much prefer an actual 6 GHz single-core processor over 4x1.5 GHz, because it wouldn't have any compatibility or efficiency issues - assuming the same architecture and power usage, of course.
Also, low-power ARM SoCs probably don't manage a comparable number of instructions per clock cycle to a high-performance x86 CPU, even a 4-year-old one.
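The point that clocks don't simply add across cores can be sketched with Amdahl's law. A minimal Python illustration, using the 4x1.5 GHz vs. 6 GHz figures from the posts above; the parallel fractions are made-up assumptions for illustration, not measurements:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Speedup over one core when only part of the work can be parallelized."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

base_clock = 1.5  # GHz per core, as on the Tegra 3 in the One X

for p in (0.5, 0.9, 1.0):
    effective = base_clock * amdahl_speedup(p, 4)
    print(f"{p:.0%} parallel work -> ~{effective:.2f} GHz single-core equivalent")

# Only a perfectly parallel workload (p = 1.0) makes 4 x 1.5 GHz
# behave like a 6 GHz single core; real workloads fall well short.
```

With half the work serial, the four cores are worth only about 2.4 GHz of single-core performance, which is the quoted poster's point.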
jacobgong said:
Well, first of all, you can't just multiply the frequency by the number of cores. I'd much prefer an actual 6 GHz single-core processor over 4x1.5 GHz, because it wouldn't have any compatibility or efficiency issues - assuming the same architecture and power usage, of course.
Also, low-power ARM SoCs probably don't manage a comparable number of instructions per clock cycle to a high-performance x86 CPU, even a 4-year-old one.
Click to expand...
Click to collapse
I agree. When multi-core CPUs first came out, Intel said doubling the core count gives about a 47% boost in total performance (not 2x like Apple says; they simply don't know). Let's round that to 50% to make the math a little easier.
So we can do the math here as: 4 cores at 1.2 GHz (when all 4 are active, the clock is 1.2 GHz) gives us 1.2 x 3/2 x 3/2 = 2.7 GHz of single-core performance. The same value for the SGS3 is 1.4 x 3/2 x 3/2 = 3.15 GHz.
Likewise, a dual core at X GHz gives us X x 3/2 = 2.7, thus X = 1.8 GHz. So if you overclock any Cortex-A9-based dual core to 1.8 GHz, you get the same performance "on paper". If you want to catch up with the SGS3, you'd need to OC it to 2.1 GHz, which is impossible at the moment, I guess.
What makes the difference is light or mixed loads on the CPU. CoreControl users have probably noticed that sometimes, when all 4 cores are active, the clock is only 480 or 640 MHz (even 320 sometimes, if I remember correctly). The same load could be handled by a dual core at about 720 or 960 MHz, but the quad-core setup stays cooler with a little less energy consumed (or wasted) - as long as all the cores sit in one unified package; physically putting 2 or 4 single cores together is not how our smartphones work. This is how Apple ensured the smoothness of the iPad 2, the new iPad and the iPhone 4S: they used two lower-clocked PowerVR SGX543 GPUs, which can clock down very low and share the load when there is little of it.
And you can always find an empty core waiting for a new task when the others are busy.
So, long story short: if we were dealing with a small number of heavy processes, a single core at 2.7 GHz would be better, since the quad-core design cannot cut one task into 4 pieces - that is, as long as we weren't thinking about battery life and heat. But since we are dealing with lots of tasks that can each be handled with 1.2 GHz of power, having 4 cores is better for battery saving, and it leaves an empty core for a new task to run in parallel with the others in the background.
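For what it's worth, the post's rule of thumb (roughly +50% performance per doubling of cores) can be written out in a few lines of Python. This is just the forum arithmetic above, not a real performance model:

```python
def equivalent_single_core_ghz(clock_ghz, cores):
    """Rule of thumb from the post: each doubling of cores adds ~50%
    performance, i.e. a factor of 1.5 per doubling (cores: power of two)."""
    doublings = cores.bit_length() - 1  # 4 cores -> 2 doublings, 2 -> 1
    return clock_ghz * 1.5 ** doublings

# Tegra 3 with all 4 cores at 1.2 GHz: roughly a 2.7 GHz single core
print(round(equivalent_single_core_ghz(1.2, 4), 2))
# SGS3 quad at 1.4 GHz: roughly 3.15 GHz
print(round(equivalent_single_core_ghz(1.4, 4), 2))
# A dual-core A9 would need 1.8 GHz to match 2.7 GHz "on paper"
print(round(equivalent_single_core_ghz(1.8, 2), 2))
```

Under this rule a quad and a dual core can land on the same "paper" number at very different clocks, which is the whole argument for the 1.8 GHz figure above.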
It is OC'd out of the box, I think - Nvidia overclocked it for us, and it's already pushing itself at the very edge of what is possible based on temperature. I seem to remember Hamdir saying something along those lines once upon a time...
Why bother to OC? It's fast enough as it is.
---EDIT---
hamdir said:
Only faux kernel betas allow OC.
Big warning: OC is bad for the HOX, given the thermal envelope.
You are risking both your battery and your processor if you OC.
I know you are used to OC'ing other devices, but those had headroom; that is not the case this time. T3 is operating at its maximum thermal capabilities on the HOX.
Click to expand...
Click to collapse
hamdir said:
The Snapdragon S2 on the Arc had a lot of headroom -
the chipset is rated at 1.5 GHz stable!
Not the case with T3; it's milking the very maximum out of the 40nm process.
In other words, Nvidia is OC'ing the T3 out of the box, because their chips are designed to survive massive amounts of heat (sadly, that doesn't mean the battery or other components would survive).
It is already overclocked, lol.
Sometimes you have to listen to the "science" of it and surrender.
Click to expand...
Click to collapse