Preface:
Although I have been "reading the mail" for a while, I am a new member, so I cannot put this in the same thread as the existing wireless mod it is meant to update. Likewise, I cannot link to my pictures directly until I have at least 10 posts submitted, so the smaller attached versions will have to do for now. With that said, I see many people are not happy about the lack of native wireless charging support on the T-Mobile Note II (T889). With over 18 years of engineering experience and a T889 of my own, I decided to investigate the issue. After 10+ hours of tearing down my own Note II and trying many of the options others have already attempted, I have a few findings.
A. The core design of the T889 is the same as the i317 and N7100. Samsung even uses a mixture of parts from the various devices to build the T889, including the main system board.
B. The only obvious difference looks to be the actual core processor, which has a hard-coded framework (similar to a BIOS) that controls the fundamental parameters of the radio and device capabilities. This framework also takes precedence over any middleware ROM loaded onto the device, so there is only so much that can be modified (liberated) without causing underlying issues. This method of development is common when creating customized versions of a device for multiple clients while also making sure the FCC Part 15 type-acceptance regulations here in the USA are adhered to without resubmitting each client version multiple times.
C. In the case of the T889, I believe the ability to facilitate the 802.11 calling feature (either ON or OFF) cost the device its ability to also allow wireless inductive charging. It may be that the stray RFI created by both functions operating at the same time would not pass FCC type acceptance, or the device was simply never type-accepted with both features enabled at the same time. Regardless, my overall conclusion is that the wireless charging feature in the T889 is disabled at the hardware level and there is no cost-effective or reasonable way to change this.
Other notes:
The above may also explain why the N7100 USB charging board swap doesn't work and why the T889 crashes when loading the home screen, even if an N7100/i317 ROM is loaded on the device. It looks like the processor and/or power management chip is hard-coded to poll the little logic chip on the T889 and i317 charging boards regardless of the ROM. If it does not see that chip (the case with the N7100 USB charging board), it cries foul and triggers a reset (shutdown) command. So, it is what it is and we must "skin the cat" a different way. [see image usb_cb_lg.png]
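To be clear, this is only my working theory from bench observation, not anything pulled out of Samsung firmware. Conceptually, the behavior acts something like the sketch below (the function names are purely illustrative):

Code:
# Conceptual sketch of the suspected boot-time check -- NOT actual Samsung code.
# The T889/i317 charging boards carry a small logic chip that the power
# management side appears to poll; the N7100 USB board does not have it.

def charging_board_present(read_board_id):
    """Poll the charging-board logic chip; False if it never answers."""
    try:
        return read_board_id() is not None
    except IOError:
        return False

def boot_power_check(read_board_id, shutdown):
    # Suspected hard-coded response: no chip detected -> reset/shut down,
    # no matter which ROM (stock, N7100, i317) is flashed.
    if not charging_board_present(read_board_id):
        shutdown("charging board logic chip not detected")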
Updated Mod:
The current wireless mod to enable inductive charging on the T889 is a great option, but while ripping my T889 apart, one thing I wanted to see was whether I could make the modification easier and cleaner at the same time. I believe I have achieved this. With my version of the modification, the distance from point A to point B has been reduced to less than 45mm, and there is no need to remove or work around a conductive trace on the charging board/speaker housing assembly. With that said, let's begin.
* Please proceed at your own risk, knowing that you will most likely void your warranty by performing this mod. Neither I nor anyone else will be held responsible if you fail to perform this modification correctly and physically damage your device. You have been warned.
- This procedure requires the use of the following tools:
1. A non-conductive and clean work space such as a wood table or natural stone counter top.
2. A soft micro-fiber cloth or pad to lay the device on while working on it.
3. A multi-meter capable of reading resistance in Ohms and DC voltage.
4. A small razor blade.
5. A #0 Phillips head precision screwdriver.
6. A small plate to keep the fasteners secure while working.
7. A common case-separating pick.
8. A professional temperature-controlled soldering iron with a pencil tip no larger than 2mm in width – set to no more than 725°F.
9. A small roll of rosin core 60/40 solder.
10. 45mm of AWG-30 [0.01" / 0.255mm] or similar solid core vinyl-sleeved wire.
11. Fine tip tweezers.
12. Experience and patience.
Now, the good stuff!
- The new location to tap into the +5VDC is on the (+) side of the Zener diode located just north of where the charging board header connector feeds the main system board. From there, it is only 40mm to the positive voltage side of the wireless charging contacts. [see images t889wcmsp.png and t889_wmod_a.jpg]
- This is how it should look once done... [see image t889_wmod_b.jpg]
- Note the small compression mark left on the wire when I first test-fitted the casing during reassembly. This can be rectified by carefully cutting a small "V" shaped groove in the casing fastener hole cross member. This will also provide a 1-2mm gap guide on the inside of the casing wall once reassembly is complete. [see image t889_wmod_c.jpg]
Once done, install your compliant (5VDC @ 1000mA) inductive receiver plate and you are good to go with a clean voltage line that will be reliable and allow you to charge the T889 via any Qi-compliant charging pad, such as the one made for the Samsung S4.
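For anyone worried about the small wire gauge, here is a quick back-of-the-envelope check. The ~0.34 ohm/meter figure is the published approximate resistance of AWG-30 copper; the numbers are illustrative, not measured on my bench:

Code:
# Rough voltage-drop check for the 45mm AWG-30 jumper (illustrative values).
AWG30_OHMS_PER_M = 0.34   # approximate resistance of AWG-30 copper wire
length_m = 0.045          # the 45mm jumper
current_a = 1.0           # worst case: receiver delivering its full 1A

resistance_ohm = AWG30_OHMS_PER_M * length_m      # ~0.015 ohm
drop_v = resistance_ohm * current_a               # ~0.015 V
print(f"Drop at {current_a:.0f} A: {drop_v * 1000:.1f} mV "
      f"({drop_v / 5.0 * 100:.2f}% of the 5 V rail)")

In other words, the short AWG-30 run costs roughly 15 mV at full current, which is negligible on a 5V line.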
Have fun and be safe,
Scott
By the way, I have also attached a pic (see image t889pcb_sm.png) showing both sides of the T889 main system board, with a message on where to get the higher definition version. It is very enlightening regarding what makes up the T889 and what I have been talking about.
Here are some additional pics of the mod working with the new Samsung S4 wireless charger that was on sale last week.
Scott-
What type of wireless charging pad did you use to put in the back of the phone? Links/pics would be great to show a 100% guide.
When placed on the wireless charger, do you get a confirmation popup every time it's placed on it?
How fast is the wireless charging rate? Same as plugging it in?
imaleecher said:
What type of wireless charging pad did you use to put in the back of the phone? Links/pics would be great to show a 100% guide.
When placed on the wireless charger, do you get a confirmation popup every time it's placed on it?
How fast is the wireless charging rate? Same as plugging it in?
The receiver pad is a common 5VDC, up to 1.0A, "Note 2 N7100" Qi-compliant pad you can find easily on eBay. Since I do not have enough posts yet, I cannot link to the source I used, but again, they are easy to find.
Re: Notification
Since the T889 does not natively support the wireless charging feature, there is no "wireless charging" notification popup present. However, functionally, everything works perfectly fine and the same notifications apply whether you place the T889 on the Samsung charging pad or plug in the traditional USB charging/sync cable.
Wireless charging pad with the T889 OFF:
- The charging pad light first turns ON/Green within 1 second of placing the unit on the pad properly.
- Then, within another 2-3 seconds, the T889 launches the battery icon and begins the charging process.
- After about 30 seconds or so, the battery icon and screen go to sleep and the red charging indicator LED takes over.
Wireless charging with the T889 ON:
- The charging pad light first turns ON/Green within 1 second of placing the unit on the pad properly.
- Then, within another 1-2 seconds, the T889 produces the two-tone (default) audible charging notification with the lightning bolt showing up on the display inside the battery icon.
- The charging process is now active and doing its thing.
Although it acts the same way as charging via the USB port, the efficiency of wireless charging is still in the 75-80% range. So in reality, and accounting for the circuit overhead on a receiver that peaks out at 1000mA, my initial finding is that the charge rate is about 15% per hour when paired with the stock 3100mAh battery (or approximately 6.75 hours for a complete charge). Regardless, it is still pretty snappy and I will test it some more over the coming days.
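For anyone who wants to check the arithmetic, here is the rough math I am using (a sketch with illustrative figures, not a measurement log):

Code:
# Back-of-the-envelope wireless charge-rate math (stock 3100 mAh battery).
battery_mah = 3100

def charge_estimate(effective_ma):
    """Percent-per-hour and hours-to-full for a given average current into the pack."""
    pct_per_hour = effective_ma / battery_mah * 100
    hours_to_full = battery_mah / effective_ma
    return pct_per_hour, hours_to_full

# My observed ~15%/hr works out to roughly 465 mA average into the battery:
print(charge_estimate(465))   # -> (~15 %/hr, ~6.7 h)

# If the receiver's full 1000 mA made it through at 75-80% efficiency
# (~775 mA average), the same math would predict a much faster charge:
print(charge_estimate(775))   # -> (~25 %/hr, ~4.0 h)

That gap between the two estimates is worth keeping in mind; as discussed a few posts down, the stock kernel caps the charging current well below the receiver's rating.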
One thing that should be noted is that the 1000mA receiver pad takes up 100% of the marginal space under the T-Mobile rear cover plate, so do not try this with anything larger than the standard 3100mAh battery unless a deeper rear cover plate can be sourced.
Scott-
One little note: the 1000mA cannot be achieved with the stock kernel. I use the Perseus kernel, which can tweak the USB charging current from 450mA to 1000mA.
For an accurate (more or less) measurement, use the Galaxy Charging Current app from the Play Store.
Sent from my SGH-T889 using xda premium
premiatul said:
One little note: the 1000mA cannot be achieved with the stock kernel. I use the Perseus kernel, which can tweak the USB charging current from 450mA to 1000mA.
For an accurate (more or less) measurement, use the Galaxy Charging Current app from the Play Store.
Sent from my SGH-T889 using xda premium
Thanks for the heads up. I will look into this.
Scott-
Alright, here are the enlightening results of the wireless and USB direct charging current utilizing Galaxy Charging Current Pro v1.6.
As stated, I am going to continue to test the actual charge times more in the coming days. :good:
Scott-
Amazing work. It's always great to see people still tinkering with their devices.
Today, I changed the stock kernel to the Saber variant and I am liking the results. With no other changes or mods, I am now seeing 900mA of the 1000mA the Qi receiver pad is spec'd at.
I was also able to charge from 81% to 91% in exactly 25 minutes with the T889 ON, the screen awake, Wi-Fi active, and the phone idling. This would equate to approximately 4 hours and 10 minutes per complete charge, which is very respectable.
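That 4 hour 10 minute figure is just a straight-line extrapolation from the 25-minute window (illustrative only; expect the real number to run a bit longer since charging tapers near the top):

Code:
# Straight-line extrapolation from the measured 81% -> 91% window.
pct_gained = 91 - 81     # 10 percentage points
minutes = 25

minutes_per_full = minutes / pct_gained * 100     # 250 minutes
print(f"~{minutes_per_full // 60:.0f} h {minutes_per_full % 60:.0f} min per full charge")
# -> ~4 h 10 min; actual time will be longer as the current tapers above ~80%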
Scott-
Just wanted to give thanks for the very in-depth and professional manner in which you take the time to explain all of this, along with the very :thumbup: detailed illustrations.
If you know so much about modifying the hardware in such a way to achieve this, you could very well be the guy every Android owner has been waiting for
to maybe one day software-mod these phones to get 3-4 day battery life regardless of the battery's mAh size.
Once again thanks for the info
Sent from my SGH-T889 using xda premium
lojak29 said:
Just wanted to give thanks for the very in-depth and professional manner in which you take the time to explain all of this, along with the very :thumbup: detailed illustrations.
If you know so much about modifying the hardware in such a way to achieve this, you could very well be the guy every Android owner has been waiting for
to maybe one day software-mod these phones to get 3-4 day battery life regardless of the battery's mAh size.
Once again thanks for the info
Sent from my SGH-T889 using xda premium
Thanks, but I am more on the hardware side than the software arena. Plus, this mod concept was not originally my idea, so I cannot take complete credit for it. I simply investigated the situation and found yet another way to achieve the end result. Regardless, the battery chemistry is just not there yet, so if you want 3-4 day operation between traditional charges, I suggest looking into one of these new solar or other alternative charging options.
http://www.hongkiat.com/blog/extraordinary-smartphone-chargers/
For now, I am back to enjoying this current QI solution and have established a repeatable charging cycle time that is within 5 minutes on each benchmark. With the Saber kernel allowing for 900mA of wireless charging current to make it through the circuit, the 0-2% to 100% charge time looks to be 4 hours, 25 minutes (+/-5 min). My latest charge cycle that just completed is shown below.
I'm happy -
Scott
Thanks for this info.
I have an AT&T SGH-I317 with a wireless charging pad. It works, but it charges fairly slowly at about 460mA. I've read that the original mod with the Perseus kernel would allow the device to charge faster.
Before I jump into it I'd like some confirmation that this mod would give my wireless charging a boost on my AT&T Note II. I haven't found a lot of complete info and am concerned.
Thanks...
RojasTKD said:
Thanks for this info.
I have an AT&T SGH-I317 with a wireless charging pad. It works, but it charges fairly slowly at about 460mA. I've read that the original mod with the Perseus kernel would allow the device to charge faster.
Before I jump into it I'd like some confirmation that this mod would give my wireless charging a boost on my AT&T Note II. I haven't found a lot of complete info and am concerned.
Thanks...
Since your i317 is already Qi capable, there is no need to perform this mod. Your specific focus should be confirming the following before flashing the kernel:
1. Is your QI charging (transmitting) platform rated at 2A (input) with an effective output of approx. 1A? Likewise, is the wall adapter feeding the pad rated at up to 2A?
2. Is the QI receiving pad mounted on your i317's rear case panel or the battery rated at 1000mA @ 5V? [Anything lower like 700mA will obviously limit your charging rate, which results in longer charging times].
I have not tested the Perseus kernel, but the Saber kernel on my T889 does liberate the higher charging rate as long as the hardware TX/RX pads support it (as covered above). I average about 980mA, which is within 5% of the rated peak for my Samsung charging platform.:good:
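Put another way, your charge rate will always be set by the weakest link in the chain. A minimal sketch of that idea (the ratings below are examples pulled from this thread, not a spec):

Code:
# The effective Qi charge current is capped by the weakest link in the chain.
# Example ratings only -- substitute your own hardware's numbers.
wall_adapter_ma = 2000   # wall adapter feeding the transmitter pad
tx_output_ma   = 1000    # effective output of the transmitter (charging) pad
rx_rating_ma   = 1000    # receiver pad under the rear cover
kernel_cap_ma  = 1000    # what the kernel's charging manager allows (~450-466 on stock)

effective_ma = min(wall_adapter_ma, tx_output_ma, rx_rating_ma, kernel_cap_ma)
print(f"Expected charge current: ~{effective_ma} mA")
# Swap in a 650 mA receiver or leave the stock kernel cap in place and that
# number becomes your ceiling, no matter how strong the rest of the setup is.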
Scott-
SGBE said:
Since your i317 is already Qi capable, there is no need to perform this mod. Your specific focus should be confirming the following before flashing the kernel:
1. Is your QI charging (transmitting) platform rated at 2A (input) with an effective output of approx. 1A? Likewise, is the wall adapter feeding the pad rated at up to 2A?
2. Is the QI receiving pad mounted on your i317's rear case panel or the battery rated at 1000mA @ 5V? [Anything lower like 700mA will obviously limit your charging rate, which results in longer charging times].
I have not tested the Perseus kernel, but the Saber kernel on my T889 does liberate the higher charging rate as long as the hardware TX/RX pads support it (as covered above). I average about 980mA, which is within 5% of the rated peak for my Samsung charging platform.:good:
Scott-
Thanks for the reply.
I have read that the charge rate is limited by the device to something like 466mA. Mine charges at 460. Some say the kernel will raise it; others say the kernel alone won't do it and a hardware mod (like this one) is required.
To answer your questions:
1. Transmitter Nokia DT-900 output 750mA.
2. Receiver rated at 650mA
I'd be happy to get the charge rate around 600mA. I know my setup is not capable of charging at 900mA, but I would like to do better than the 460mA I'm getting now... if possible.
If I can get the charging rate higher, I'd look into getting a 1000mA receiver, but I won't be getting a new transmitter until the prices drop. I was able to get several Nokia chargers for just under $20 a piece.
When I get home I will try Saber kernel.
Sent from my Nexus 7 using Tapatalk 4
Try the custom kernel, since that does control the charging manager functions. If the change is marginal, your RX pad may actually be functioning like a 500mA rated pad. Hopefully, you will see something closer to 600-640mA with your setup (even if the charge time will not be significantly reduced). If anything, invest in a good RX pad rated up to 1000mA so you know you are able to accept any level of energy up to 1A regardless of the charger you use now or buy later.
SGBE said:
Try the custom kernel, since that does control the charging manager functions. If the change is marginal, your RX pad may actually be functioning like a 500mA rated pad. Hopefully, you will see something closer to 600-640mA with your setup (even if the charge time will not be significantly reduced). If anything, invest in a good RX pad rated up to 1000mA so you know you are able to accept any level of energy up to 1A regardless of the charger you use now or buy later.
With Saber Kernel 39.3 I am somehow getting a reading of 899mA; not sure how, as neither my receiver nor my transmitter should allow this. Though the pic on the FastTech site showed a green receiver pad that had 5V/650mA printed on it, the one I got doesn't say anything, so it may be capable of up to 1000mA. Even so, my Nokia should not be able to supply enough to charge at the reading I'm getting.
Well, I've always used the Perseus kernel because Saber did not support my 64GB MicroSD card. I flashed Saber Kernel 39.3 and it does support my 64GB MicroSD. Even the stock Saber kernel that comes with Jedi X17 does not support my exFAT 64GB card. So I guess I'm happy.
We'll see what happens. Thanks for the help, it is greatly appreciated!
Very cool! Your charging time should be greatly reduced if everything is being reported correctly by the app. Please confirm once you test a few cycles.
Sent from my SGH-T889 using Tapatalk 4
SGBE said:
Very cool! Your charging time should be greatly reduced if everything is being reported correctly by the app. Please confirm once you test a few cycles.
Sent from my SGH-T889 using Tapatalk 4
My battery drain seemed to be much greater with the Saber kernel vs the Perseus kernel I was using. So I went back to the Perseus kernel, and even though it reads 466mA vs Saber's 899mA, they both seem to charge at about the same rate (10-12% per hour). I didn't do a real detailed comparison, so Saber may get an extra percent or two per hour, but that means little if it drains the battery much faster.
Maybe I'll try Saber 39.3 again to double check the drain and charge times.
On a sad note, I cracked my screen again. Luckily the replacement screen price has dropped considerably; I paid less than half of what I did the first time I had to replace it.
SGBE said:
Today, I changed the stock kernel to the Saber variant and I am liking the results. With no other changes or mods, I am now seeing 900mA of the 1000mA the Qi receiver pad is spec'd at.
I was also able to charge from 81% to 91% in exactly 25 minutes with the T889 ON, the screen awake, Wi-Fi active, and the phone idling. This would equate to approximately 4 hours and 10 minutes per complete charge, which is very respectable.
Scott-
SGBE, thanks for all the info you've provided in this thread. I have a few requests: what settings are you using in the Saber kernel, and where did you go to change these settings? Thanks again!
Sent from my SGH-T889 using xda premium
I'm having problems getting the solder to stick to the resistor up by the charging contacts. I'm afraid if I try much more the resistor will be toast. Can I solder directly to the top contact, and would it still work?
I was about to purchase a tablet, looking at the Toshiba Excite 7.7, LG G Pad, and Lenovo Yoga 8. I like the Toshiba for the display and the Yoga for battery life; wondering about battery life on the G Pad.
Not so good.
Sent from my LG-V500 using Tapatalk
tclaw said:
I was about to purchase a tablet, looking at the Toshiba Excite 7.7, LG G Pad, and Lenovo Yoga 8. I like the Toshiba for the display and the Yoga for battery life; wondering about battery life on the G Pad.
I can play about 3 hours straight of Dead Trigger on one charge (high graphics) OR more than 6 episodes of Walking Dead (no commercials, so about 45 mins per episode) and still had some charge left, forgot how much.
ASW1 said:
Not so good.
Sent from my LG-V500 using Tapatalk
Helpful one, aren't ya? :laugh:
erwaso said:
Helpful one, aren't ya?
Do you want a short or a long answer?
erwaso said:
I can play about 3 hours straight of Dead Trigger on one charge (high graphics) OR more than 6 episodes of Walking Dead (no commercials, so about 45 mins per episode) and still had some charge left, forgot how much.
Helpful one, aren't ya? :laugh:
I think his answer was perfect. There's good battery and bad battery, it's as simple as that. X hours of playing/watching X by somebody tells me nothing about a gazillion other factors playing a role (or not). This tab has a not so good battery but it isn't terrible either.
ASW1 said:
Not so good.
Ty for your response
Sent from my LG-V500 using Tapatalk
erwaso said:
I can play about 3 hours straight of Dead Trigger on one charge (high graphics) OR more than 6 episodes of Walking Dead (no commercials, so about 45 mins per episode) and still had some charge left, forgot how much.
I think I'll go with the Yoga; probably gonna play FPse and other emulators, so the CPU/GPU doesn't have to be a beast.
Helpful one, aren't ya? :laugh:
lol
ASW1 said:
Do you want a short or a long answer?
I think I got the answer I was looking for.
android404 said:
I think his answer was perfect. There's good battery and bad battery, it's as simple as that. X hours of playing/watching X by somebody tells me nothing about a gazillion other factors playing a role (or not). This tab has a not so good battery but it isn't terrible either.
I think you're right; I was going for longer battery life in the end. The Yoga is supposed to get 10 hours of video time or more, but gaming will probably be in the 6-8 hour range for middle-of-the-road games.
I use it for reading, web browsing, email and such, plus the occasional puzzle game or chess. I'm getting about 3 days out of it. So it's not great, but as I use it at home and the charger is always close by, screen resolution was the more important point for me. Before this, I had the Yoga for about 2 weeks before exchanging it for the G Pad. The Yoga's battery life was just amazing compared to the G Pad, but the screen has 100 ppi less, which made reading really straining on the eyes in my case. Just to give one of hopefully many examples that can help you decide based on your own usage patterns.
erwaso said:
I can play about 3 hours straight of Dead Trigger on one charge (high graphics) OR more than 6 episodes of Walking Dead (no commercials, so about 45 mins per episode) and still had some charge left, forgot how much.
I've rooted and stuck the Mahdi ROM (4.4.2) on mine. Been using it for nearly a week now, but the battery isn't brilliant. Had a Chinese 8" tab before which also wasn't good with battery (and an awful ppi for that matter).
Roughly, my usage is similar to, or just a bit better than, erwaso's. I get a bit more on certain things like video playback, as I've optimised the build.prop to save on battery. I've also used an app to stop all major wakelocks and throttled the CPUs to run at a slightly lower clock speed with the ondemand governor.
Personally I don't like any Apple products, but you can't beat their tablets for battery life; Android is too power hungry!
Mikegrmn said:
I use it for reading, web browsing, email and such, plus the occasional puzzle game or chess. I'm getting about 3 days out of it. So it's not great, but as I use it at home and the charger is always close by, screen resolution was the more important point for me. Before this, I had the Yoga for about 2 weeks before exchanging it for the G Pad. The Yoga's battery life was just amazing compared to the G Pad, but the screen has 100 ppi less, which made reading really straining on the eyes in my case. Just to give one of hopefully many examples that can help you decide based on your own usage patterns.
Yea, I don't read, I never learned lol. No, I don't read anything other than an article on the web. I wanted it for movies and light gaming via FPse and other emulators, so the Yoga's long battery life was what I was going for. The Toshiba has a slightly better CPU and screen, but the battery life reports in reviews vary a lot. $200-250 was my price range, also considering the Nook HD+, which gets decent battery life in the reviews and has a full HD screen. I'm leaning towards that at $180; I can pick it up tomorrow. The others I have to order online. Ty for your feedback.
I also saw the Microsoft Store has the Dell Venue 8 Pro on sale for $230; it can play older PC games like COD Modern Warfare and such, and it gets 8hr battery life. It's the last tablet for me for a while, my wife's cutting me off; I've bounced around tablet to tablet and phone to phone, lol, so I'm sure as soon as I buy one I'm going to wish I had bought a different one lol.
Has anybody started working out with the Moto 360? How accurate is the watch?
I ran with it the other night. From a steps perspective its count was comparable to another pedometer I have, so I'd say it was pretty accurate.
Sent from my Nexus 5 using XDA Free mobile app
The background heart rate monitor is really bad though. I really liked that it encouraged you to do 30 mins of vigorous activity a day. I thought it would really help me get in better shape.
The pedometer, I agree, is fairly accurate and is not off by more than 200 compared with my LG G3. I try to take 8,000-10,000 steps a day, so being off by 200 is very good in my books. But as far as keeping track of my daily exercise, there are 2 issues with this... The first is that it would be nice to change my vigorous activity goal to 45 mins, 1 hour, or any other value. 30 minutes is recommended by the American Heart Association, but I am sure that trying to double that goal daily would be even better. This issue should be fixed very easily and hopefully soon too. The other and more important issue is how bad it is at keeping track of your exercise time.
Here is how my day went: I woke up and after I put my watch on, I WALKED downstairs, ate a quick breakfast and hopped in my car for work. Then I took an ELEVATOR to the 7th floor and SAT down in my office. After an hour or so of sitting there, I was alerted that I was halfway to my goal for the day, or rather Motorola's goal for me... I can pretty much guarantee my heart rate did not go over 90-100 beats the whole time, yet somehow the 360 gave me credit for 15 minutes of vigorous exercise. I hope that this, like the first issue, is a software issue. If it is a hardware issue, which it easily could be, the background heart rate monitor is completely useless until the Moto 360 2, or whatever they want to call it, comes out.
The other weird thing is that when I request my heart rate to be measured, it seems fairly accurate. I don't have any experience with other smartwatches, but I know this one is not bad. I changed it so that it uses Google's app to measure my heart rate, which gives me hope that it is just a software issue with the background heart rate monitor and that it will be fixed.
I still think I am going to keep the watch for the time being despite the crappy processor, subpar battery, and bad background heart rate monitor. I am excited about Android 5 / L / Lemon Meringue Pie / Lollipop. I think the software upgrades will make this very good looking and OK performing watch into something of a beast.
Playing Candy Crush for 22 hours straight can take its toll on your battery (and your marriage). Rate this thread to express how the Huawei Mediapad M3's battery performs under heavy use. A higher rating indicates that it lasts a long time even when playing games, streaming video and audio, and doing other CPU-intensive activities.
Then, drop a comment if you have anything to add!
Heavy usage always impacts the battery. I get a few hours when I play games and it is good enough for me.
tmihai20 said:
Heavy usage always impacts the battery. I get a few hours when I play games and it is good enough for me.
I have a question. I'm planning to get the M3 10.1 LTE. What do you mean by "heavy usage"? I will play games (Gameloft titles, or something like Syberia), so I don't think they are really heavy (on a Note 4 I was able to play Syberia 2 for 3-4 hours), but also streaming TV or DivX. At the same time I am thinking of getting a Samsung SM-T819 10.1 LTE (but I am not sure which will have the better performance for battery life / streaming). What do you think? Huawei or Samsung?
@helen2: I only recommend the 8-inch Mediapad M3. The 10-inch Mediapads (Youth or Lite) have issues with the touchscreen and are not as smooth as the 8-inch M3. You could do a lot with the 8-inch M3. It will be easier to carry around, and performance is still very good. If you want an alternative, the S2 is not a bad choice. The Lenovo Tab 4 8 Plus is also quite good. I do not know if having a 10-inch tablet is a must. I had a 10-inch tablet in the past and I find 8 inches to be the perfect size for watching anything on it, while still small enough to carry around.
Heavy usage means graphically intensive games like Mortal Kombat X, Real Racing 3, shooters and so on.
@tmihai20 OK, no, I don't like Lenovo. Yes, I was thinking of getting the 8.4-inch M3, but I must look in the shop. I have a 5.5-inch Note 4, so maybe 8.4 will be too small; I think the best is 9.7, since I mean to use it as an alternative to a netbook. I will have a look tomorrow in the shop to see if they have the 8.4.
Thanks!
@helen2: it is a shame you do not want a Lenovo tablet, because some people I know have returned the M3 in favor of the Tab 4 8 Plus. It is also recommended all over Reddit as an alternative to the M3. I have not used any Samsung tablets, but the model you specified (Samsung SM-T819) is from 2016 and may not get updates past Nougat.
Huawei has also announced the Mediapad M5, which will have Quick Charge, the only thing missing from the M3. The SoC will be the HiSilicon Kirin 960s, not the Kirin 970; it will not have an audio jack and it will have a Type-C connector. It may be worth waiting for the M5, if you do not mind the missing audio jack.
tmihai20 said:
@helen2: it is a shame you do not want a Lenovo tablet, because some people I know have returned the M3 in favor of the Tab 4 8 Plus. It is also recommended all over Reddit as an alternative to the M3. I have not used any Samsung tablets, but the model you specified (Samsung SM-T819) is from 2016 and may not get updates past Nougat.
Huawei has also announced the Mediapad M5, which will have Quick Charge, the only thing missing from the M3. The SoC will be the HiSilicon Kirin 960s, not the Kirin 970; it will not have an audio jack and it will have a Type-C connector. It may be worth waiting for the M5, if you do not mind the missing audio jack.
Crazy decision by Huawei, leaving the Audio Jack out on a tablet. :crying:
@Masteryates: one of the dumbest decisions. But hey, why not force everybody to use the damned adapters, so they can damage their tablets more easily and make them buy new ones? There is absolutely no justification to remove audio jack on a tablet.
tmihai20 said:
@Masteryates: one of the dumbest decisions. But hey, why not force everybody to use the damned adapters, so they can damage their tablets more easily and make them buy new ones? There is absolutely no justification to remove audio jack on a tablet.
Agreed Tmihai, :highfive:
In many ways it could be argued that tablets should have 2 headphone jacks to allow 2 people to watch when you're on a train, etc. You can get an analogue audio splitter, but I doubt you can get anything similar for USB-C.
My BTV-DL09 lasts about 5 to 6 hours.
Well, I am honestly shocked by this. I sort of expected the S20 Ultra not to meet my standards, but not the Note 20 Ultra! I decided to stick with my Note 10+ for a bit, until maybe the Z Fold 2 or the Note 20 Ultra drops in price, or for who knows how long.
Anyway, I thought I would share with you guys why I decided to stick with the Note 10+ over the Note 20 Ultra.
First off, the Note 20 Ultra is nowhere near a bad phone, and it's a big improvement over the S20 Ultra, which I thought was junk. The main highlights of the device just didn't seem worth $1300+ to me.
Screen: overall it is literally just a hair bigger and actually a hair less vibrant and saturated than the Note 10's.
Screen refresh rate: still not sure why this is such a huge deal. Can you tell a difference? Yes, slightly, depending on what you are doing. Sort of like 4K on a tablet or small laptop; really just not worth it IMO.
Cameras, oh yes the cameras. Glad to see the autofocus issue from the S20 Ultra was fixed, and photos are great no matter which camera, but the main sensor (unless you are using the 108MP mode for extreme detail, and then you lose HDR) looks very similar to the Note 10's. The only huge difference is the zoom lens on the Ultra, and it is a big one. 5x zoom looks great and even 10x in most cases looks very good and usable. This is the only thing that made me actually want to keep the 20.
Everything else is pretty much the same as the Note 10+; nothing else worth mentioning really.
A few photos of the differences in whites and camera bumps, and the pretty much identical screens.
Note 10+ physically looks a lot better to me.
denism81 said:
Note 10+ physically looks a lot better to me.
Yeah I was reading reviews and wasn't pleased with what Samsung did.
The variable refresh rate is cool, especially if it saves battery; kudos there.
The cam has laser-assist AF lock too, I believe; well done.
5G, good.
The fastest Snapdragon yet yields a real performance boost although the 10+ never seems slow.
The bad: the price tag is through the roof for the 512GB model.
This Note is even harder to protect than the 10 due to the cam hump. That also means it really needs a case. Reports of no factory screen protector; not good, especially since you'll want to lay it face down because of the cam hump.
It doesn't have hardware support for aptX HD Bluetooth; very disappointing.
Not enough gain to replace the Note 10+; however, it may be a good upgrade for a Note 8 or older.
However the Note 10+ is still a viable option especially if you want Pie.
In 2 months my Note 10+ will be a year old and it's still looking great, running strong; in fact, better than it ever has. Truth be told, I'm still learning to use many of its features... not bored or tired of it by a long shot.
Samsung gave us nothing much new for HD audio in the 20; aptX HD should have been present... a 3.5 mm jack would have been nice too.
Samsung needs a top shelf flagship model to get many people to upgrade from the 10+; this isn't it.
Add to that the world economy is a mess.
Samsung should have tried harder and catered more to the performance crowd to set this new Note further apart from the 10+.
Part of the reason for the fail is Samsung doesn't listen to its customers very well. Oh well... I'll wait.
On my Note 10+ I want:
A) Better battery.
B) Better fingerprint sensor. I see the new Pixels keep it on the back side. Well done, Google.
C) Get rid of the silly front camera hole.
As long as these don't change, there is no reason for me to upgrade. Software updates and unnecessary camera and screen changes do not attract me, especially at that price tag.
Speaking for those of us who get the Exynos chipset... Samsung sucks balls...
They are selling INFERIOR hardware for the same price. This relegates the 120Hz refresh rate to HD+ only and NOT to UHD, not because the phone can't handle it but because the Exynos chipset can't. They can't give part of the world this and leave everyone else out in the rain, as that would highlight the inadequacies of the Exynos.
The Exynos throttles, is a bigger drain on the battery, it alters the picture quality of photos taken, and Samsung has the balls to charge us the same as for the Snapdragon 865+ chipset...
For this they can shove the Note 20 Ultra, big camera bump and all, where the sun don't shine...
The price is just pure GREED! Useless money spent on crap publicity!
Any "high end" phone with Exynos cpu is ****.
However 120hz screen is fine. For me. Don't care much for qhd+/8k bull****. On a phone screen ?!?
I like Note 10+ 5g for the square screen (very rare these days) and the design. Only minus for me is that the display could be at least 90hz.
A good thing on 10+ and 20 ultra is sd card slot.
I will never buy 20 ultra even if the price will be 500 euros with exynos cpu!
denism81 said:
Note 10+ physically looks a lot better to me.
Maybe enough returns will get Samsung's attention to listen to what their customers want... and don't want.
The large storage is great, but not without easy 24-bit audio output. The 3.5 mm jack: there's room for it, and they could have spent the extra few cents on a BT chipset that supports aptX HD.
Sad, because this fix was easy and cheap to do.
This very expensive phone should have the best and latest chipsets in it; it doesn't. A locked bootloader doesn't give me any thrills either; Samsung Pay and Knox, grrrr.
Fail, again.
As for the cams, at this price point buying a dedicated Canon for shooting makes more sense; much better interchangeable optics and dedicated AF/image processors.
I use my 10+ more like a laptop than a cam...
That cam hump sucks and I see it as a major liability from a damage standpoint. The Note 10+ is hard enough to protect; the 20 U is far worse.
Then there's wittle Bixby... other than its cam smart functions it's completely worthless to me and a huge privacy invasion.
Wearables have the same privacy issues and need all the permissions under the sun to even load, really?
Samsung is very hard of hearing.
Kudos for jamming their Note 20 U where it belongs...
The only thing I wish Samsung would do for their camera is make it so that when you turn off HDR it actually turns off. So ridiculous. HDR is always on no matter what you do, besides switching to Pro mode.
I'm sticking with my Note 10+. Am very pleased with it and I've only suffered 2% battery degradation in the first year of use. Using it daily 4 to 12 hours screen on time.
I love how thin the device is and the camera bump on the Note 20 Ultra is not acceptable to me.
Agreed, the Note 20 Ultra is a bit too overpriced, $1300 for almost no improvement over the previous year's device? No thanks.
Raydianze said:
I'm sticking with my Note 10+. Am very pleased with it and I've only suffered 2% battery degradation in the first year of use. Using it daily 4 to 12 hours screen on time.
I love how thin the device is and the camera bump on the Note 20 Ultra is not acceptable to me.
How is it possible to measure that? Battery degradation, I've been curious lately about that
TonyGzl92 said:
How is it possible to measure that? Battery degradation, I've been curious lately about that
Using AccuBattery. Installed it first thing when I bought my Note 10+
Raydianze said:
Using AccuBattery. Installed it first thing when I bought my Note 10+
When you first set it up, the battery estimate it gives is from the Android system's battery degradation estimates. I have another app that can see it as well.
Its overlay mA meter is useful.
Raydianze said:
Using AccuBattery. Installed it first thing when I bought my Note 10+
Seriously bro?
AccuBattery isn't accurate at telling you the battery health on current devices. It's a commonly known fact on here.
Limeybastard said:
Seriously bro?
AccuBattery isn't accurate at telling you the battery health on current devices. It's a commonly known fact on here.
I consider it more a battery charging tool.
It's useful as a charge alarm, for its milliamp usage overlay and battery temp info.
Its charge history is useful; unfortunately its highest resolution is in minutes.
I divide the total amount of milliamp-hours absorbed during the charge cycle by the time it took to gauge battery health. I consider anything above 85 [email protected] good in the 30-70% range.
As the battery degrades I expect this value will decrease.
Lol, the phone's own battery life estimate isn't any better.
The app is glitchy; it stops recording charge history. A reload every now and then solves that. This would make its long term wear estimates inaccurate even if its wear curve is accurate.
Short term though, its wear graph gives a good comparative indication of how much you're degrading the battery.
It graphically illustrates why you don't want to charge above 80% or go below 30% very often.
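A rough sketch of the session math I mean (the session numbers below are made up just to show the division; ~4300 mAh is the Note 10+'s nominal capacity):

Code:
# Rough sketch of the charge-session math described above (made-up numbers).
mah_absorbed = 1700      # total mAh the app reports going in during the session
minutes = 60             # how long the session took
pct_start, pct_end = 30, 70

avg_ma = mah_absorbed / (minutes / 60)                            # average charge current
implied_capacity = mah_absorbed / ((pct_end - pct_start) / 100)   # implied full-pack mAh

print(f"Average charge current: {avg_ma:.0f} mA")
print(f"Implied full capacity:  {implied_capacity:.0f} mAh (nominal ~4300 mAh)")
# The implied capacity should trend downward as the battery wears; the
# average current is the per-session "health" figure described above.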
blackhawk said:
I consider it more a battery charging tool.
It's useful as a charge alarm, for its milliamp usage overlay and battery temp info.
Its charge history is useful; unfortunately its highest resolution is in minutes.
I divide the total amount of milliamp-hours absorbed during the charge cycle by the time it took to gauge battery health. I consider anything above 85 [email protected] good in the 30-70% range.
As the battery degrades I expect this value will decrease.
Lol, the phone's own battery life estimate isn't any better.
The app is glitchy; it stops recording charge history. A reload every now and then solves that. This would make its long term wear estimates inaccurate even if its wear curve is accurate.
Short term though, its wear graph gives a good comparative indication of how much you're degrading the battery.
It graphically illustrates why you don't want to charge above 80% or go below 30% very often.
Indeed. As bad as some of its functionality is, I still use it and have done so since my Note 4 days. This and GSam are normally the first two apps that get installed on any new Android device that I use.
Limeybastard said:
Indeed. As bad as some of its functionality is, I still use it and have done so since my Note 4 days. This and GSam are normally the first two apps that get installed on any new Android device that I use.
Going to start unplugging the charger right after a battery percentage point flips to try to get better than a 60 second resolution for the charge history.
Lol, the first app I install is the package disabler.
Going to try GSam, thanks.... we'll see how well its battery tracker does.
blackhawk said:
Going to start unplugging the charger right after a battery percentage point flips to try to get better than a 60 second resolution for the charge history.
Lol, the first app I install is the package disabler.
Going to try GSam, thanks.... we'll see how well its battery tracker does.
Just make sure to remove it from battery optimization, same as AccuBattery Pro.
Limeybastard said:
Just make sure to remove it from battery optimization, same as AccuBattery Pro.
The only things I have toggled in Device Care are the Optimize for power setting and fast charging.
Then I disable Device Care.
I'm using the old factory-loaded Pie version, which has 360° on it. I use its cache cleaner as it cleans well, but I put it in airplane mode first.
All buckets show as active in standby apps; no power management is active other than the embedded Android ones. Runs great.