Finally we have an official response from Motorola.
I'm quite surprised, but it's true: the Defy is equipped with an OMAP3610.
BUT it has also been confirmed that the Defy has a PowerVR SGX 530 renderer inside.
http://community.developer.motorola...elopment-resources-available/td-p/7794/page/3
What's your opinion? Why an OMAP3610 plus a separate graphics renderer instead of an OMAP3630 (which has the same renderer inside)?
Like I said, I'm quite surprised; I was pretty sure the Defy's core was a 3630. But in my opinion what really matters is the graphics renderer, and it is present!
I can only assume it was cheaper to buy and/or productize the two separate chips as opposed to the 3630.
I can't believe that answer......
'OMAP3610 + separate graphics renderer' is cheaper than an OMAP3630??
Is that possible?
Actually, you can believe it, due to yield when producing the product. It's the same thing with multi-core chips: putting two dual-core dies into a CPU package is cheaper in the long run than one quad-core die, because of flaws in the production of the cores. Don't believe me? Look it up. It could have been a deal brokered to Motorola: excess yield of a chip may have made it cheaper to buy them in massive quantities. A shortage of the chip you mentioned, combined with availability of the two-chip combo, could have been another cause.
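To illustrate the yield argument, here's a rough sketch of the classic exponential (Poisson) yield model - all numbers are completely made up for illustration, not actual TI/Motorola figures:
Code:
public class YieldSketch {
    public static void main(String[] args) {
        // Poisson yield model: yield = exp(-D * A), where D is defect
        // density (defects per mm^2) and A is die area (mm^2).
        double d = 0.005;          // hypothetical defect density
        double bigDie = 120.0;     // one integrated SoC
        double smallDie = 60.0;    // each die of a two-chip combo

        double bigYield = Math.exp(-d * bigDie);     // ~54.9%
        double smallYield = Math.exp(-d * smallDie); // ~74.1%

        // Cost per *good* die scales roughly with area / yield, so two
        // small dies can beat one big die even at the same total area,
        // because bad small dies are discarded individually.
        System.out.printf("120mm^2 die yield: %.1f%%%n", bigYield * 100);
        System.out.printf("60mm^2 die yield:  %.1f%%%n", smallYield * 100);
    }
}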
Well.. this is shaping up to be interesting.
I've heard this thing benchmarks like a beast, so it will be interesting to see what happens with Froyo.
It does make sense: instead of one SoC like Hummingbird, they could have two dedicated chips. I would imagine that with SoCs they run into a lot of potential problems; if one part of the chip is bad, the whole chip is bad.
Although it would be funny if they managed to shove an old 8800 in there somehow..
I think Moto didn't tell the truth!
You won't want to hear this, but....
the 3610 doesn't have an ISP (Image Signal Processor)!!!
[image]
source: http://en.wikipedia.org/wiki/Texas_Instruments_OMAP#OMAP3
The 3630 has 3D and IVA 2+:
Code:
IVA 2+ accelerator enables multistandard (MPEG-4, H.264) encode/decode
and WMV9 encode at D1 (720 x 480 pixels), 30 fps
Up to 720p HD resolution
The 3620 has IVA and 3D.
The 3610 has IVA only:
Code:
IVA 2 accelerator enables VGA-to-DVD video content playback
up to XGA/WXGA display resolutions
So the Defy's CPU/GPU combo is more like a 3620.
source: http://focus.ti.com/lit/ml/swpt024b/swpt024b.pdf
That only affects camera picture quality. Phone cams are useless anyway.
I don't believe it.
Everything I have ever plugged the phone into and every benchmark tool tell me it's a 3630, including RSD Lite. Nothing has ever told me it's a 3610.
My personal opinion is that Motorola don't want to advertise that it has this CPU, as the phone is much cheaper than its "higher end" models and they don't want everyone going out and buying the Defy over, say, the Droid X or Droid 2.
Whatever it's got, the benchmark results of this phone are awesome - and it's on 2.1.
Higgsy said:
I don't believe it.
Everything I have ever plugged the phone into and every benchmark tool tell me it's a 3630, including RSD Lite. Nothing has ever told me it's a 3610.
My personal opinion is that Motorola don't want to advertise that it has this CPU, as the phone is much cheaper than its "higher end" models and they don't want everyone going out and buying the Defy over, say, the Droid X or Droid 2.
Whatever it's got, the benchmark results of this phone are awesome - and it's on 2.1.
Agree with you
I agree too. Let's get the camera to record 720p - then we will have evidence.
shorty66 said:
I agree too. Let's get the camera to record 720p - then we will have evidence.
yeah deffo
shorty66 said:
I agree too. Let's get the camera to record 720p - then we will have evidence.
yep, hope so
jellydonut said:
That only affects camera picture quality. Phone cams are useless anyway.
playback of 720p is a little laggy though...
The 3610 and 3630 have the same APU, right?
And apparently Motorola have paired the 3610 up with the same GPU that the 3630 has built-in, right?
So rather than it being a 3630 that's built in and Motorola are lying, isn't it more likely that the software that reports it to be a 3630 SoC is incorrectly identifying it due to the GPU?
I mean, as far as any software is concerned, if it's the APU that the 3630 has and the GPU that the 3630 has, then it must be a 3630.
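For what it's worth, this is roughly all an identification tool can even see - a minimal Android sketch (class and method names are just for illustration; the GL call assumes a current GL context):
Code:
import android.opengl.GLES10;
import android.os.Build;

public class ChipIdSketch {
    // What "identification" tools typically report. Nothing here names
    // the SoC directly - the GL string only names the graphics core
    // (e.g. "PowerVR SGX 530"), so a 3610 + SGX530 board can look
    // exactly like a 3630 to software.
    public static String describe() {
        return "Board: " + Build.BOARD + "\n"
             + "Device: " + Build.DEVICE + "\n"
             // Must be called on a thread with a current GL context.
             + "GPU: " + GLES10.glGetString(GLES10.GL_RENDERER);
    }
}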
Just a quick thought: would an OMAP3610 even have an external bus interface that you could hook an SGX530 core up to?
Looking at the info for the AM37xx/DM37xx, the only way you could hook up such a thing would be through a memory interface, which IMHO is rather painful.
android.modaco.com/index.php?s=&showtopic=320961&view=findpost&p=1478526
android.modaco.com/index.php?s=&showtopic=320961&view=findpost&p=1483501
Just a rhetorical question: if it's a 3610 with an "external" GPU, then why does the device even "say" it's a 3630? (Cannot post correct links, sorry.)
TPGamer said:
Just a rhetorical question: if it's a 3610 with an "external" GPU, then why does the device even "say" it's a 3630? (Cannot post correct links, sorry.)
My post (2 above yours) offers (IMO) one potential theory why that's the case.
Step666 said:
My post (2 above yours) offers (IMO) one potential theory why that's the case.
I'm not great with low-level hardware stuff, but the Win7/Moto USB drivers identify the device by some kind of ID, right? The guy on Modaco said that the picture in his second post was taken when he was about to flash a new ROM onto the phone. And the boot ROM is in the SoC, along with the USB controller. So in that case, basically, the SoC identifies itself as a 3630. Correct me if I'm wrong; as I said before, I'm not great with this kind of stuff.
new image from pocketnow:
[image]
Picture in the WP7 Mondrian ROM (showing a Blackstone with WP7 buttons)
A bit suspicious is that the accompanying specs the tipster sent Pocketnow differ from what we have seen references to.
Specifically:
Size: 116.7 x 58.7 x 11.9 mm
Display: 3.7" WVGA AMOLED capacitive touch screen
Camera: 8MP with autofocus, camera flash and high-definition 720p video capture
Internal memory: 16 GB flash, 512 MB RAM
Chipset: Qualcomm QSD8672, 1.5GHz
We're seeing references to a 4.3" display and a Qualcomm QSD8650A/B 1.3GHz CPU in the reg files of the ROM.
Brandon Miniman (pocketnow.com)
(source)
Here are some of my own (fake) artworks:
The second is just the old Touch HD with the buttons changed.
(You can actually remove that pic.)
The first is the real deal.
My belief is that these are fake.
Reasons are two-fold:
1. The black spacing is too big. HTC would not release a design with this flaw.
2. Whilst the user is able to, OEMs are forbidden to remove or even move the four main tiles.
The four Microsoft tiles (Phone, Contacts, SMS and Email) must not be moved from the top of the list in their pre-defined locations.
From the WP7 customization document given to OEMs - here. I therefore doubt that HTC would release a mock-up design like this.
Casey
I've heard that WP7 will support multiple operating systems on one phone. Anyone else hear that?
I have a feeling the keyboardless WP7 phones are all going to look boringly similar.
I have a feeling the OEM avatar is just an avatar of what you would see when you connected the phone to the computer and synced to Zune, something like that. Maybe it's just used as an image throughout the system with no significance. However, the picture in question looks interesting. We all know that HTC has been advertising and shouting their name on their phones a little more [i.e. bigger logos on the back]. Now, as far as I know, I don't think they've ever included the phone's name on the back - notice the HTC "Mondrian" - but then again, one can't help but wonder if HTC is going somewhere new with their phones. The dual flash looks a little suspicious as well, and the curvature of the chin is un-HTC-like. Until we know more about the person who submitted the photo, it's all speculation..
Just my 2 cents.. :]
Whatever HTC has in store, I'm quite confident it won't disappoint. Has anyone been disappointed with the specs/design of HTC's phones lately? I know I haven't.
josh_prayyforplagues said:
one can't help but wonder if HTC is going somewhere new with their phones.
Well, the phone in the picture is anything but new; it's the good old HTC Touch HD.
A Qualcomm QSD8672 chipset? That would be crazy, so I doubt it. On the other hand, devices featuring the QSD8672 chipset are expected to launch in the second half of this year, so the time frame is right.
For those not familiar with the Qualcomm QSD8672, let's just say that it is a beast of a chipset. It features a dual-core 1.5 GHz Scorpion CPU, a pretty decent GPU, and 1080p encoding and decoding. The GPU is comparable to, if not better than, what OMAP4, Tegra 2 and Samsung's Hummingbird have. The CPU is faster than anything else on the market, and the chip has other benefits compared to the other SoCs out there. All in all, a very good chipset to have as the baseline specification of your platform.
But as I said, I doubt that the first WP7 phones will all have a Qualcomm QSD8672 chipset. One can always dream though.
Quite frankly, I think both pictures are fake. The first released picture, like everyone else is saying, is just a Blackstone. The second one is fake because HTC doesn't actually put the codename on the device itself. My Tilt2/TouchPro2 doesn't say 'Rhodium' on the back, and I don't think you would see 'Mondrian' on the back of this device when it comes out either.
I just think it's too early to speculate.
So many people with no clue here^^
So it's basically the HD3?
Not like we didn't see this coming. What I would like to see is a US launch of the device before the end of the year (as opposed to the HD2, which wasn't launched in the US until March).
The second picture isn't a 'fake', but it could be a dummy image for testing purposes.
However, I'm not sure if it has any bearing on reality, or if they've ripped the info from other sources, but the Mondrian is now listed on GSMArena with its specs etc. and the same 'Blackstone-style' image.
http://www.gsmarena.com/htc_mondrian-3338.php
With regards to seeing 'branding' etc. on the device, personally I feel that a prototype device might have branding for use within HTC's design department.
A bit suspicious is that the accompanying specs the tipster sent Pocketnow differ from what we have seen references to.
Specifically:
Size: 116.7 x 58.7 x 11.9 mm
Display: 3.7" WVGA AMOLED capacitive touch screen
Camera: 8MP with autofocus, camera flash and high-definition 720p video capture
Internal memory: 16 GB flash, 512 MB RAM
Chipset: Qualcomm QSD8672, 1.5GHz
We're seeing references to a 4.3" display and a Qualcomm QSD8650A/B 1.3GHz CPU in the reg files of the ROM.
Brandon Miniman (pocketnow.com)
The 1.5GHz specs are 100% fake. Why am I the only one questioning them?
"HTC HD3 by NAK ... My concept about the HTC HD3 ..."
http://nak-design.over-blog.fr/
http://nak-design.over-blog.fr/pages/HTC_HD3_by_NAK_-2727167.html
Mondrian Spec?
hxxp://product.pcpop.com/pk/276905_266752.html
Scroll down, it shows 1GB RAM and 1280x800 resolution??
philliphs said:
hxxp://product.pcpop.com/pk/276905_266752.html
Scroll down, it shows 1GB RAM and 1280x800 resolution??
..err... it also shows the Aria (the Android phone) as having MS Office Mobile... don't think so somehow! I would say that this is made up from various sources and they haven't checked them very well.
A 1.3GHz processor? Whoa, that's gonna be fast!!
Micro USB???
No card slot??
I don't need to read anything more to know this is sooooo fake!!!
+ Que PPC said:
Micro USB???
No card slot??
I don't need to read anything more to know this is sooooo fake!!!
MS have stated that WP7 devices will not have SD card slots and will come in various fixed capacities instead... like the iPhone (6GB, 16GB, 32GB etc.), and there will be no file system access. So they might have done away with micro USB and gone with a new connector type, seeing as WP7 phones need to be synced via the Zune software (also similar to the iPhone + iTunes).
welki1979 said:
MS have stated that WP7 devices will not have SD card slots and will come in various fixed capacities instead... like the iPhone (6GB, 16GB, 32GB etc.), and there will be no file system access. So they might have done away with micro USB and gone with a new connector type, seeing as WP7 phones need to be synced via the Zune software (also similar to the iPhone + iTunes).
So Microsoft's next phone is a lot like the iPhone.
There are a lot of similarities, yes... but there are also a lot of differences between the iPhone and WP7.
The truth is that until the release (Dec 2010) we won't know the full details, as a lot of functions will also be device-specific. People seem to forget that MS set a MINIMUM specification but not a maximum one (obviously), and there are a number of RULES that they are also enforcing, such as the 'no removable SD card' rule and no custom UIs.
Does anyone have a work around for the 30 fps cap the Incredible has on the GPU?
What 30fps cap?
More like, it just doesn't have enough balls to hit 30+ fps most of the time
hahaha true true. I've read that HTC has capped the EVO and the Incredible at 30fps on the GPU.
No, just the Evo. Something to do with HDMI.
I would expect a higher Quadrant score with the 1GHz processor; the Droid X puts up almost the same numbers and it's running 2.1.
The GPU is making up for it. Quadrant is very GPU oriented.
That and the OMAP is faster. So its a lose-lose situation for us. The GPU in the X walks all over ours.
Kind of like integrated graphics vs a discrete graphics card in a computer.
Makes sense... too bad, though. I'm the type of guy that always has to have the latest and greatest. Don't get me wrong, the Incredible is a great phone - can't beat the form factor - but I wish it had a little more power under the hood in the area of graphics processing.
HeyItsLou said:
Makes sense... too bad, though. I'm the type of guy that always has to have the latest and greatest. Don't get me wrong, the Incredible is a great phone - can't beat the form factor - but I wish it had a little more power under the hood in the area of graphics processing.
I agree 100%.
adrynalyne said:
The GPU is making up for it. Quadrant is very GPU oriented.
That and the OMAP is faster. So its a lose-lose situation for us. The GPU in the X walks all over ours.
Kind of like integrated graphics vs a discrete graphics card in a computer.
It's pretty interesting, actually. I thought the same thing until I saw Quadrant run on a Droid X. It does get higher fps in most of the 3D tests (though the rotating planet/moon is really choppy, in the low teens of fps).
Anyway, because I had nothing better to spend my money on, I bought the advanced version of Quadrant.
[image]
The GPU does score a bit higher in 3D, though for whatever reason the I/O (read/write) test scores crazy high compared to anything else. That's where most of the score is coming from...
But on topic: there is no framerate cap. Download fps2d from the Android Market and see for yourself.
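If anyone wants to roll their own check instead of trusting fps2d, here's a minimal sketch of the same idea (Android Java, assuming a GLSurfaceView in the default continuous render mode; names are just for illustration): draw trivial frames as fast as allowed and count them per second. A hard 30fps driver cap would pin the log output near 30.
Code:
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.opengl.GLSurfaceView;
import android.util.Log;

public class FpsRenderer implements GLSurfaceView.Renderer {
    private int frames = 0;
    private long windowStart = 0;

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {}

    public void onSurfaceChanged(GL10 gl, int w, int h) {
        gl.glViewport(0, 0, w, h);
    }

    public void onDrawFrame(GL10 gl) {
        // Cheapest possible frame: just clear the screen.
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
        frames++;
        long now = System.currentTimeMillis();
        if (windowStart == 0) windowStart = now;
        if (now - windowStart >= 1000) {
            Log.d("fps", "frames/sec: " + frames);
            frames = 0;
            windowStart = now;
        }
    }
}
Attach it with glSurfaceView.setRenderer(new FpsRenderer()) in an activity. Note that v-sync will still hold you at the panel's refresh rate (around 60), which is not the same thing as a 30fps cap.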
It's interesting that you say that. I've been doing a lot of development work lately on the DInc, and when transferring files I have been getting blazing speeds... much higher than on any other solid-state/card-reading device I own (why couldn't Nvidia stick a sexy little GPU in there... *sigh* just have to wait a couple of years).
I think the lack of 3D performance on the Incredible is due to the drivers HTC includes for the Adreno GPU. I've read that the drivers they include aren't optimized or tweaked at all.
I remember there being a thread a while back about porting drivers from the Samsung devices using it, since they were getting much better performance out of the same GPU. Anyone remember that, or have any new info on that development?
I'm very disappointed with the Slackdragon chip. I hate the slowdowns during gameplay. Asphalt consistently crashes on this phone.
The guys using the more-than-3-year-old HTC Vogue just got a new kernel and are enjoying over 30fps in Neocore. I have never reached 26fps. Granted, they have DZO, who's a brilliant programmer, but there is no reason why a phone that's years old and is not even an Android phone (at least not natively) should be anywhere close to ours. That phone came with a 400MHz chip and no drivers for the GPU.
I love the form factor and screen on this phone, but I doubt any amount of overclocking will ever solve its problems. My next phone will definitely not have a Qualcomm chip.
Sent from my ADR6300 using XDA App
I enjoy the phone for what it is.
Otherwise I would have returned it. If someone bought this phone thinking it was a gaming powerhouse...LOL?
Buyer beware, research first.
I will take a 1GHz CPU over a 550MHz CPU, even if the slower one plays games better. I need it to be fast as a phone, not crappy slow except for games.
When Android starts being GPU-driven, then I will pay more attention to faster GPUs.
You can have the fastest GPU of all and still run like crap. Ask Samsung Galaxy S owners who are suffering lag and slowness due to apps installing only to internal SD storage. But hey, they run games fast!
Dude, I didn't buy the phone to replace my Xbox, but it'd be nice if games didn't crash on my phone. On paper the Snapdragon seems great, but in actuality the first Droid isn't far behind at all at half the clock speed, and it outperforms this phone when overclocked to 1.2GHz.
I still really like this phone, especially now with Froyo, but that doesn't mean I can't wish the Snapdragon chip weren't such a slacker. I did my research, and unless I went with the iPhone (which I'd never get) I was not going to get better 3D performance... on paper. Even video slows down on this phone sometimes. That just doesn't seem right.
Sent from my ADR6300 using XDA App
Take Froyo on both phones.
Run Linpack with the Moto at 1.2GHz and the DInc at 1GHz.
Chuckle at how much higher the DInc scores.
Game performance is about the same (with the overclocked Moto Droid taking the lead), but the rest is not even in the same ballpark as the DInc.
It's likely capped at 30 fps to sync with the display refresh rate (otherwise known as v-sync), and yes, with modifications to the drivers it may be possible to increase this. But for now I agree with adrynalyne: enjoy the phone for what it is.
Don't get me wrong, I'm always game for more performance, but even at stock clock speeds the Incredible is plenty fast. A faster GPU would be at the bottom of my list of tweaks to be made. Even so, I'm sure someone will be working on it soon enough.
Sent from my ADR6300 using Tapatalk
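To make the v-sync point concrete, here's a back-of-envelope sketch (render times are made up): on a 60Hz panel a frame is shown on the first refresh boundary after it finishes, so effective rates snap to 60/30/20/15 fps rather than varying smoothly.
Code:
public class VsyncSketch {
    public static void main(String[] args) {
        double refreshHz = 60.0;
        double intervalMs = 1000.0 / refreshHz;  // ~16.7ms per refresh
        double[] renderTimes = {10, 17, 25, 40}; // hypothetical ms/frame
        for (double ms : renderTimes) {
            // The frame waits for the next refresh boundary after it's done.
            int refreshes = (int) Math.ceil(ms / intervalMs);
            System.out.printf("%.0fms render -> %.0f fps%n",
                              ms, refreshHz / refreshes);
        }
        // Prints: 10ms -> 60, 17ms -> 30, 25ms -> 30, 40ms -> 20 fps
    }
}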
The device is released on the 3rd; it's time to open a forum section and get some discussion going.
This thread agrees with you...
http://forum.xda-developers.com/showthread.php?t=787007&page=2
BUMP
The Defy is now actually available, rooted and confirmed to have a GPU.
Anyone interested?
There is interest in this phone, but not as much as I would have thought.
You'd think a ruggedized phone like the DEFY would get more interest, but then again it hasn't been out for even a week yet.
Maybe after it has been out for a while there will be more people that take an interest in it.
Here is hoping so!!
shorty66 said:
BUMP
The Defy is now actually available, rooted and confirmed to have a GPU.
Anyone interested?
really? good news!
I posted the "evidence" on GPU and root in the other thread.
There is a one-click root solution available.
I don't know if the bootloader has been hacked yet, though.
shorty66 said:
I posted the "evidence" on GPU and root in the other thread.
There is a one-click root solution available.
I don't know if the bootloader has been hacked yet, though.
bring it on!
+1 for a section on this phone I plan to get soon!
+1
I have a Defy arriving tomorrow. Would love to see a forum for it!
+1
Motorola Defy arriving tomorrow!
+1 Can't wait to get this phone! My friend just picked it up, and I got to play with it.
Quadrant scores were right in line with a Vibrant's. Hopefully it gets Froyo before the end of the year, but I could wait; it was more than fast enough.
Weight was about the same as the Vibrant, but the form factor is awesome. With this screen being 3.7" and the Vibrant's being 4", you'd never guess they were that close, because the Defy is so much smaller in the hand. It's just all screen.
Games loaded fine, probably thanks to the PowerVR GPU, and web browsing seemed pretty quick as well. Sound quality was also like most Motorola phones: really good. These guys can build radios and phones. And the screen is great in direct sunlight; I think it's got a transflective TFT under that Gorilla Glass.
But all in all, I don't understand why there isn't more hype around this phone. I'm thinking this is one of those phones that would agree with many people: not a 4.3" monster, nor a 3.2" mini. A slightly bigger screen than an iPhone's, but a smaller, lighter body. More than powerful enough to run anything, and they made it life-proof... I know that in the current market there are front-facing cameras and 4G, but really, most people don't need either. They're great features for a select few, but the majority of people don't need a front-facing camera and wouldn't come close to utilizing 4G. Look at how many people kept their BlackBerrys and iPhone 1s and 2Gs.
Hopefully community support will grow for this phone!! I think it's time for me to start learning how to cook up some ROMs soon. I'm getting one today!
+1 for a Defy section
I would like to have a phone that doesn't need extra protection.
That is the reason I am very interested in this phone.
I can only hope some good hackers and cooks buy this phone too... custom ROMs are a must.
+1
My Defy will arrive on Monday next week.
+1
This phone looks sweet, except that the processor is only 800MHz and the PowerVR GPU is not included. But the rugged factor is a sale for me. I was using an MT3G in the field yesterday as a stopwatch, GPS, level, camera, notepad, tethering device and for tunes! Blowing dust and sprinkling rain were the only concerns, and they would have been mitigated with this rig. Bring on the new era; these phones have the capability of shortening my pack list for field days.
The Defy actually HAS a PowerVR SGX530 included.
The official specs have been updated accordingly, and Quadrant says so too on my Defy (which arrived yesterday).
shorty66 said:
The Defy actually HAS a PowerVR SGX530 included.
The official specs have been updated accordingly, and Quadrant says so too on my Defy (which arrived yesterday).
Sadly... you are mistaken. I wish it had a PowerVR, but it does not; it's purely a CPU. I found it in a file on the TI site:
http://focus.ti.com/lit/ml/swpt024b/swpt024b.pdf
Regardless, it needs its own section so we can get some development going on it.
The Defy does have a PowerVR. It's a simple mistake made by Motorola or TI with the OMAP 3610. It definitely does.
GPU (OpenGL)
Vendor: Imagination Technologies
Renderer: PowerVR SGX 530
Version: OpenGL ES-CM 1.1
Max texture units: 4
Max texture size: 2048
Max lights: 8
VBO: supported
Frame buffers: unsupported
Cube maps: supported
Texture combiners: supported
Crossbar combiner: supported
androidpit.de/de/android/forum/thread/406478/Defy-Power
android-hilfe.de/motorola-defy-forum/52754-klaerung-defy-gpu-cpu.html
Already confirmed by Motorola. This misinformation is a reason why people are not looking into the phone. It's an OMAP 3630, actually.
[image]
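For anyone who wants to reproduce a dump like the one above on their own device, here's a minimal sketch using plain OpenGL ES 1.x queries - it must run on the GL thread (e.g. inside onSurfaceCreated of a GLSurfaceView.Renderer), and the class and method names are just for illustration:
Code:
import javax.microedition.khronos.opengles.GL10;

public class GpuReportSketch {
    public static String report(GL10 gl) {
        int[] v = new int[1];
        StringBuilder sb = new StringBuilder();
        sb.append("Vendor: ").append(gl.glGetString(GL10.GL_VENDOR)).append('\n');
        sb.append("Renderer: ").append(gl.glGetString(GL10.GL_RENDERER)).append('\n');
        sb.append("Version: ").append(gl.glGetString(GL10.GL_VERSION)).append('\n');
        gl.glGetIntegerv(GL10.GL_MAX_TEXTURE_UNITS, v, 0);
        sb.append("Max texture units: ").append(v[0]).append('\n');
        gl.glGetIntegerv(GL10.GL_MAX_TEXTURE_SIZE, v, 0);
        sb.append("Max texture size: ").append(v[0]).append('\n');
        gl.glGetIntegerv(GL10.GL_MAX_LIGHTS, v, 0);
        sb.append("Max lights: ").append(v[0]).append('\n');
        // Feature lines like "VBO: supported" come from checking the
        // GL_EXTENSIONS string for tokens such as GL_OES_texture_cube_map.
        sb.append("Extensions: ").append(gl.glGetString(GL10.GL_EXTENSIONS));
        return sb.toString();
    }
}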
MTDG said:
The Defy does have a PowerVR. It's a simple mistake made by Motorola or TI with the OMAP 3610. It definitely does.
GPU (OpenGL)
Vendor: Imagination Technologies
Renderer: PowerVR SGX 530
Version: OpenGL ES-CM 1.1
Max texture units: 4
Max texture size: 2048
Max lights: 8
VBO: supported
Frame buffers: unsupported
Cube maps: supported
Texture combiners: supported
Crossbar combiner: supported
androidpit.de/de/android/forum/thread/406478/Defy-Power
android-hilfe.de/motorola-defy-forum/52754-klaerung-defy-gpu-cpu.html
Already confirmed by Motorola. This misinformation is a reason why people are not looking into the phone. It's an OMAP 3630, actually.
That's interesting. How did you find that info? I mean, I hope it's right. It just doesn't make sense, because the phone is still sluggish during screen transitions and pinch-to-zoom. Maybe it's MotoBlur doing that, but for that kind of GPU it should be much better. I installed LauncherPro, and it helped a little.
First, here's how I scored my Nitro for $50 (on contract).
Currently AT&T (instore and online) is selling the Nitro with an instant $100 off.
If you go to http://www.retailmenot.com/view/att.com and click through their AT&T Promo link that's another $100 off. So now the phone costs $50 (plus tax) and free 2-day shipping.
As you can see in my signature, I'm a Nexus One user. The very MINUTE the phone was available for order on their site, I ordered.
I was a devout Nexus fan (actually, I still love my Nexus One); I love the notion of no bloat and a completely open system. When the Galaxy Nexus was due, you can imagine my excitement. Now imagine my complete frustration with the constant delays and misinformation. I was also quite disappointed that Google decided to put the 1.2GHz OMAP CPU in their supposed "hardware platform pusher" phone. Blah!
But while I was eagerly awaiting the GNex, I saw a new offering from Samsung in the salacious Galaxy Note! A 5.3" beauty with all the guts of the GNex (actually, a FASTER processor, a 1.4GHz Exynos) and a stylus with a Wacom-certified screen!... But dammit, no US version! The international GSM version will work on AT&T, but NOT 4G LTE! And the Note's Verizon LTE version would have the Snapdragon with the Adreno 205 GPU in it (OLD, WEAK GPU!!).
Too many strikes against the front runners and too many yummies for the Nitro underdog!
Are you able to upgrade a line using this, or is it just New Customers/Add a line?
I'd imagine this is for New Customers only...At least for the extra $100 from retailmenot:
*Offer requires online activation via att.com/wireless on qualified rate plans $39.99 or more with a two-year agreement. iPhone, GoPhone, netbooks, and certain other devices not eligible. Certain devices require a minimum data or messaging plan. Discount will be automatically applied to your shopping cart. Credit approval and other restrictions apply.
I can't get any of the discounts to show up, at least over a week later.
Sadly that RetailMeNot discount could be regional. I've heard quite a few others had issues getting the discount to work in other areas (I'm in the NJ area).
Another good reason to choose the Nitro over those two - no screen burn-in.
There is more to processors than the MHz rating. The 1.2GHz A9 in the GNex absolutely destroys the Nitro's 1.5... Also take a look at the development communities behind both; enough said!
Just curious: was/is the Galaxy Nexus expected for the AT&T network?
[email protected] said:
Just curious: was/is the Galaxy Nexus expected for the AT&T network?
Yes, based on leaked FCC specs: 3G only, no LTE. But this is kind of speculation only.
droidstyle said:
There is more to processors than the MHz rating. The 1.2GHz A9 in the GNex absolutely destroys the Nitro's 1.5... Also take a look at the development communities behind both; enough said!
"absolutely destroys" the Nitro's 1.5? Really? Any actual proof of that? And I'm not talking about specsheet comparisons, I mean actual benchmarks.
Seriously though, I've been searching the net and can't find any decent benches for comparison between these 2.
Namuna said:
"absolutely destroys" the Nitro's 1.5? Really? Any actual proof of that? And I'm not talking about specsheet comparisons, I mean actual benchmarks.
Seriously though, I've been searching the net and can't find any decent benches for comparison between these 2.
The Snapdragon APQ8060 is just naturally slower than the 1.2GHz OMAP4 in the GNexus, because the OMAP is a Cortex-A9.
Qualcomm used a modified Cortex-A8 design for better battery efficiency, but you sacrifice horsepower.
Now the 1.2GHz Exynos chip... that is a battery drainer, but the amount of horsepower that thing has is just godly.
droidstyle said:
There is more to processors than the MHz rating. The 1.2GHz A9 in the GNex absolutely destroys the Nitro's 1.5... Also take a look at the development communities behind both; enough said!
Nitro destroyed by Galaxy Nexus:
[image]
scott0 said:
Nitro destroyed by Galaxy Nexus:
[image]
Time after time benchmarks have proven to be inaccurate. The Galaxy Nexus's processor is much faster than the Nitro's.
8mileroad said:
Time after time benchmarks have proven to be inaccurate. The Galaxy Nexus's processor is much faster than the Nitro's.
I was replying to the benchmark request from the poster quoted below, and I provided one. You are free to interpret it any way you choose, even discount it entirely; it makes no difference to me. I interpret it as matching my real-world use of the Nitro. As experience proves time and again, it's not always the faster processor that makes a fine-performing phone.
Namuna said:
"absolutely destroys" the Nitro's 1.5? Really? Any actual proof of that? And I'm not talking about specsheet comparisons, I mean actual benchmarks.
Seriously though, I've been searching the net and can't find any decent benches for comparison between these 2.
Longcat14 said:
The Snapdragon APQ8060 is just naturally slower than the 1.2GHz OMAP4 in the GNexus, because the OMAP is a Cortex-A9.
Qualcomm used a modified Cortex-A8 design for better battery efficiency, but you sacrifice horsepower.
Now the 1.2GHz Exynos chip... that is a battery drainer, but the amount of horsepower that thing has is just godly.
This is yet more conjecture. After more digging, I've attached an image I found in one of Anandtech's articles showing a comparison chart between the ARM Cortex-A8, Cortex-A9 and our Nitro's Qualcomm Scorpion.
Indeed, while searching for more technical information on the architecture of the APQ8060 chip, I found some info that it's based on the ARM Cortex-A8, but if you look at the attached image, the Scorpion is mostly equal to or HIGHER spec than the A9, such as in pipeline depth and NEON.
Now, since we're talking about system-on-chip here, let's not forget what GPU the GNex is using vs. the Nitro: PowerVR SGX540 vs. Adreno 220. It doesn't take much searching to find that the Adreno 220 is superior to the SGX540, and here are a few benches showing the Nitro ahead of the GNex on performance:
OpenGL-ES 2.0 graphics performance benchmark:
http://nena.se/nenamark/view?version=2
As to the remark that benchmarks are inaccurate: well, I certainly agree that I've seen my fair share of benches debunked or showing inaccurate results based on outdated software, but at least they're TESTED results and not just opinions based on spec sheets.
I have a few more days before my 30 days are up with the Nitro. I'm seriously debating whether or not to return it and just get an SGS2, because the Nitro doesn't seem to be selling in volumes big enough to get real dev support.
So, from my understanding,
the international one is Tegra 3 while the AT&T model is an S4?
Is there a huge difference in performance between the two?
I mean... one is quad-core, and the other is only dual.
tian105 said:
So, from my understanding,
the international one is Tegra 3 while the AT&T model is an S4?
Is there a huge difference in performance between the two?
I mean... one is quad-core, and the other is only dual.
There's a fairly lengthy discussion on this over on the One XL forum...
http://forum.xda-developers.com/showthread.php?t=1609878
Yes, but the S4 has proven that quad isn't necessarily better than dual! The performance is almost the same; in some cases the S4 is even better (not THAT much better, though), but in others (like multitasking and gaming) the Tegra 3 really shines *performance-wise*.
If they both had 32GB, then I'd have to decide. As is, I don't want a 16GB limit.
Considering that the ICS UI is GPU-accelerated, wouldn't the T3 version feel smoother than the S4 (gestures and UI interactions, not loading speeds), since its GPU is more capable than the Adreno 225? I mean, that's why the GNex doesn't feel as smooth as an SGS2, for example: the dual-core TI OMAP has a seriously powerful CPU but a subpar GPU pushing that 720p screen...
Just a thought
Sent from my HTC PH39100 using Tapatalk 2
AT&T: 28 nm Cortex-A15 Base
Int'l: 40 nm Cortex-A9 Base
It's like comparing a 1st-gen quad-core to a current-gen dual-core in the PC world (maybe not that drastic, but you get the point).
The Adreno 225 is only marginally slower in gaming, not enough to make a fuss over. According to my contact at AT&T it is perfectly smooth.
http://briefmobile.com/htc-one-x-snapdragon-s4-krait-vs-nvidia-tegra-3-comparison
Thanks, everyone, for posting your opinions!
After reading the link BarryH_GEG provided,
I have made the decision to pass up the X and get the XL instead, for the following reasons:
1. Better battery life.
2. LTE.
3. More snappiness due to OS utilization?
4. A quad core is useless to me in real life. Why? Because the first quad-core desktop CPU launched almost 7 years ago, and even now software developers are just slowly learning to utilize more than two cores (see the sketch below).
Ready to buy, tomorrow!
And thanks again, everyone!
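On point 4 above: a toy sketch of what "utilizing more than two cores" actually takes. The work has to be split up explicitly, which is why most app code still leaves the extra cores idle. Class name and numbers are just for illustration.
Code:
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class CoresSketch {
    public static void main(String[] args) throws Exception {
        final int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        List<Future<Long>> parts = new ArrayList<Future<Long>>();
        for (int i = 0; i < cores; i++) {
            final int offset = i;
            // Each worker sums a strided slice of the range, so the
            // work is spread evenly across all available cores.
            parts.add(pool.submit(() -> {
                long sum = 0;
                for (long n = offset; n < 100000000L; n += cores) sum += n;
                return sum;
            }));
        }
        long total = 0;
        for (Future<Long> f : parts) total += f.get();
        pool.shutdown();
        System.out.println(cores + " cores, total = " + total);
    }
}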
designgears said:
AT&T: 28 nm Cortex-A15 Base
Int'l: 40 nm Cortex-A9 Base
It's like comparing a 1st-gen quad-core to a current-gen dual-core in the PC world (maybe not that drastic, but you get the point).
The Adreno 225 is only marginally slower in gaming, not enough to make a fuss over. According to my contact at AT&T it is perfectly smooth.
http://briefmobile.com/htc-one-x-snapdragon-s4-krait-vs-nvidia-tegra-3-comparison
On a side note, will you be developing for the XL?
designgears said:
AT&T: 28 nm Cortex-A15 Base
Int'l: 40 nm Cortex-A9 Base
It's like comparing a 1st-gen quad-core to a current-gen dual-core in the PC world (maybe not that drastic, but you get the point).
The Adreno 225 is only marginally slower in gaming, not enough to make a fuss over. According to my contact at AT&T it is perfectly smooth.
http://briefmobile.com/htc-one-x-snapdragon-s4-krait-vs-nvidia-tegra-3-comparison
The Snapdragon S4 is based on Krait, which is totally different from the A15. Performance is only slightly better than the A9 in integer terms, but vastly better for floating point and memory bandwidth. The Adreno 225 is much slower than the GeForce in Tegra 3, but that won't be felt in general UI and games, since they hardly taxed even the Adreno 220 before it, and the several-times-slower 205 runs most games, like Dead Space, without a problem.
tian105 said:
because the first quad-core desktop CPU launched almost 7 years ago, and even now software developers are just slowly learning to utilize more than two cores.
I actually bought the Intel Core 2 Quad Extreme quite a few years ago, and the only things I've had to upgrade for gaming are the video card and RAM. I imagine I'm going to get a few more years of high-quality gaming out of this PC before I need to upgrade. I don't have DDR3 RAM or PCI Express 3, but I doubt it matters much. I just hope the CPU doesn't die before then! I'm just shocked at how long I've been able to stay on the quad, and now they have 8 cores?
tian105 said:
On a side note, will you be developing for the XL?
lol..we will wish him good luck
tian105 said:
On a side note, will you be developing for the XL?
I sure plan on it; I hope to release a ROM for it at some point.
Trying to dump the X for the XL (AT&T) right now.
You get one thing with the Teg3 that you don't with the S4: a really cool game. It's built into the system and really visualizes the Teg3 chip in action. And being the only Teg3 phone on the market, it's a One X exclusive.
[image]
That's right! It's your battery indicator! Imagine how impressed your friends will be when they actually get to watch your battery life count down in real time. Their phones can't do anything as trick.
BarryH_GEG said:
You get one thing with the Teg3 that you don't with the S4: a really cool game. It's built into the system and really visualizes the Teg3 chip in action. And being the only Teg3 phone on the market, it's a One X exclusive.
That's right! It's your battery indicator! Imagine how impressed your friends will be when they actually get to watch your battery life count down in real time. Their phones can't do anything as trick.
LOL, nice one
brent8577 said:
I actually bought the Intel Core 2 Quad Extreme quite a few years ago, and the only things I've had to upgrade for gaming are the video card and RAM. I imagine I'm going to get a few more years of high-quality gaming out of this PC before I need to upgrade. I don't have DDR3 RAM or PCI Express 3, but I doubt it matters much. I just hope the CPU doesn't die before then! I'm just shocked at how long I've been able to stay on the quad, and now they have 8 cores?
my point exactly!
cheers!
designgears said:
LOL, nice one
I know I'm sounding *****y, but I'm really happy with the phone. After some charging cycles I'm getting decent battery life for the way I use it. I'm going on a trip Friday, which will be the real test, as I'm much heavier on my phone when I'm on the road. I'd still recommend it, and I like the larger amount of storage. So don't let my sarcasm put people off; it's still a great device.