I saw in the Google Pixel specs that it includes EIS 2.0 (electronic image stabilization) but does not include OIS (optical image stabilization). I have a Nexus 6P, so does that mean I have an older version of EIS?
What version of EIS is in the Nexus 6P, what is the difference between the two, and can it record 4K with EIS?
The answer will decide whether I buy the Pixel or not, because video recording is important to me.
Thank you in advance.
ali8383 said:
I saw in the Google Pixel specs that it includes EIS 2.0 (electronic image stabilization) but does not include OIS (optical image stabilization). I have a Nexus 6P, so does that mean I have an older version of EIS?
What version of EIS is in the Nexus 6P, what is the difference between the two, and can it record 4K with EIS?
The answer will decide whether I buy the Pixel or not, because video recording is important to me.
Thank you in advance.
From what I've gathered, the stabilization on the 6P is purely software based.
On the Pixel, however, Google has tied the camera to the gyroscope. The gyroscope polls 200 times a second to stabilize the image. So while it doesn't have OIS, it's not just software on the Pixel.
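For anyone curious what "tying the camera to the gyroscope" looks like in practice, here's a rough Kotlin sketch of the general idea: sample the gyro at roughly 200 Hz and accumulate a per-frame rotation that a video pipeline could use to counter-shift each frame. This is just an illustration of the concept, not Google's actual pipeline, and helper names like `consumeRotation` are made up for the example.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

class GyroStabilizer(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)

    // Rotation (radians) accumulated about x/y/z since the last video frame.
    private var pitch = 0f
    private var yaw = 0f
    private var roll = 0f
    private var lastTimestampNs = 0L

    fun start() {
        // 5,000 us between samples is roughly 200 Hz, the rate mentioned above.
        sensorManager.registerListener(this, gyro, 5_000)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        if (lastTimestampNs != 0L) {
            val dt = (event.timestamp - lastTimestampNs) * 1e-9f // ns -> s
            // event.values are angular velocities in rad/s; integrate over dt.
            pitch += event.values[0] * dt
            yaw += event.values[1] * dt
            roll += event.values[2] * dt
        }
        lastTimestampNs = event.timestamp
    }

    // Called once per video frame: hand back the accumulated rotation so the
    // frame can be counter-rotated/cropped, then reset for the next frame.
    fun consumeRotation(): FloatArray {
        val r = floatArrayOf(pitch, yaw, roll)
        pitch = 0f; yaw = 0f; roll = 0f
        return r
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
}
```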
scandalousk said:
From what I've gathered, the stabilization on the 6P is purely software based.
On the Pixel, however, Google has tied the camera to the gyroscope. The gyroscope polls 200 times a second to stabilize the image. So while it doesn't have OIS, it's not just software on the Pixel.
tigercranestyle said:
^^^ What this guy said, though I thought I heard it polled the gyroscope 2000 times a second. I looked around, but can't remember where I read/heard it. But yeah, @ali8383, the 6P is strictly software based while the Pixel is software/hardware.
Also, the Nexus 6P couldn't use EIS to record 4K. The Pixel can.
Thank you for the explanation.
Couldn't the 6P also poll its gyro, given the right software?
B3501 said:
Couldn't the 6P also poll its gyro, given the right software?
It probably doesn't have the CPU power to handle everything needed... kind of like how HDR+ is way better on the Pixels. That, or Google is pulling shady moves and purposely hindering past devices to push its new product.
I don't know what they are using for stability, but I did notice the pictures from the Pixel phones were much sharper and more detailed. Check out the video I made of a real-world camera test on YouTube. I got to play with the actual phones a few days before they came out, and this was the first thing I checked out. Just Google techplughd. Thanks.
This might help (go to minute 28):
https://www.dpreview.com/news/9782565306/google-launches-pixel-and-pixel-xl-smartphones
EDIT: the video in the link isn't set to the right time, which is why I've noted the minute where the video stabilization is shown.
4redstars said:
This might help (go to minute 28):
https://www.dpreview.com/news/9782565306/google-launches-pixel-and-pixel-xl-smartphones
EDIT: the video in the link isn't set to the right time, which is why I've noted the minute where the video stabilization is shown.
Thank you. I watched the video again and now understand how it works.
Even if it seems like a nice feature, the lack of OIS is still a sin in 2016, and for the price they intend to charge.
Besides this, polling the gyroscope 200 times per second is still more expensive (in terms of processing) than just adding the proper hardware.
I don't believe this kind of stabilization could be better than normal EIS, so I'm staying skeptical until I see real-world videos.
Here's a really good explanation of OIS vs. EIS. As a current Nexus 6P user, low-light performance has been phenomenal, so I'm excited about the gyroscope and don't really care about not having OIS.
https://9to5google.com/2016/10/10/g...firms-that-eis-will-still-work-with-4k-video/
I have a guess as to why there's no OIS. Think of it this way: without OIS, the gyroscope data perfectly matches how the camera lens moves, and software can use that data to correct the image. With OIS, the gyroscope data no longer matches the lens movement, so EIS can only use the image data from the camera itself to do the stabilization, which is less effective (it costs more CPU and gives worse results). Some people may argue that the OIS hardware can do the work; to be honest, OIS can offset some hand shake when taking low-light photos, but during video recording that little bit of OIS does very little to smooth out the image, which isn't worth losing the ability to use the gyroscope to correct the image and produce more stable footage. Also, considering the pixel size of the camera is very large, even larger than the Note 7's, the low-light shutter speed is actually fast enough that OIS really can't make much of a difference here. I use a GS7 and I do notice that low-light pictures take a longer exposure time, but Google claims the Pixel doesn't, which supports what I'm guessing here. Let's see some real-life tests before jumping to a conclusion. OIS is good on a big rig or a big camera; on a phone, we just pick whatever works.
Does the Pixel have any sort of non-software-based image stabilization for photos? (Gyroscope stabilization has only been mentioned for video.)
4redstars said:
Here's a really good explanation of OIS vs. EIS. As a current Nexus 6P user, low-light performance has been phenomenal, so I'm excited about the gyroscope and don't really care about not having OIS.
https://9to5google.com/2016/10/10/g...firms-that-eis-will-still-work-with-4k-video/
jeffonion said:
I have a guess as to why there's no OIS. Think of it this way: without OIS, the gyroscope data perfectly matches how the camera lens moves, and software can use that data to correct the image. With OIS, the gyroscope data no longer matches the lens movement, so EIS can only use the image data from the camera itself to do the stabilization, which is less effective (it costs more CPU and gives worse results). Some people may argue that the OIS hardware can do the work; to be honest, OIS can offset some hand shake when taking low-light photos, but during video recording that little bit of OIS does very little to smooth out the image, which isn't worth losing the ability to use the gyroscope to correct the image and produce more stable footage. Also, considering the pixel size of the camera is very large, even larger than the Note 7's, the low-light shutter speed is actually fast enough that OIS really can't make much of a difference here. I use a GS7 and I do notice that low-light pictures take a longer exposure time, but Google claims the Pixel doesn't, which supports what I'm guessing here. Let's see some real-life tests before jumping to a conclusion. OIS is good on a big rig or a big camera; on a phone, we just pick whatever works.
There is no "data" from OIS; it's just a mechanical system that compensates for any movement made by the user. The compensation is immediate and there is no need to process anything, and that's why it's the preferred approach for most people. Besides, OIS helps a lot with low-light pictures, and even though the Nexus 6P was really capable, the addition of OIS could have made for a formidable camera experience.
https://youtu.be/l5d2F6nP5MY?t=25s
EIS can't help with pictures; it's only used for video, and even though it somehow does the job, the results are not so good and it tends to produce a lot of jelly effect. When you have OIS available, you can also make it work in conjunction with EIS and the results are awesome. Another point for OIS is that it works at all resolutions, while EIS is dependent on the resolution and the processing power.
Think of it this way: the best smartphone cameras are the ones that include OIS, and they produce really decent results even in low light. OIS helps you with both photo and video, while EIS is only for video.
I switched some months ago from a phone with OIS to one that doesn't have it, and I can say it's a world of difference in detail, even though the second one has the better camera on paper; and when you combine OIS and EIS, you get a really nicely stabilized video without having to sacrifice much.
sabesh said:
Does the Pixel have any sort of non-software-based image stabilization for photos? (Gyroscope stabilization has only been mentioned for video.)
Exactly my point. Google is boasting about its new camera and its new stabilization, but most people take more photos than videos, and as far as I know, OIS is the only way to properly "stabilize" when taking pictures. Besides this, I would love to see manual controls and long exposure on this camera to see how well it does considering the lack of OIS, and whether it's on par with other smartphones.
Galaxo60 said:
Even if it seems like a nice feature, the lack of OIS is still a sin in 2016, and for the price they intend to charge.
Besides this, polling the gyroscope 200 times per second is still more expensive (in terms of processing) than just adding the proper hardware.
I don't believe this kind of stabilization could be better than normal EIS, so I'm staying skeptical until I see real-world videos.
But think about it. OIS is usually requested because it performs better in low-light conditions and stabilizes video (it's not there to prevent blurry pictures). Google opted to go with a larger sensor that has larger pixels, which in turn offers much better performance in low light. They then stabilized the camera with the gyroscope to prevent the jelly effect during recording. It's just a different take on the camera that will probably work just as well. Maybe even better.
Google has stated that the camera has a special core dedicated to it, meaning processing power isn't lost at all.
scandalousk said:
But think about it. OIS is usually requested because it performs better in low-light conditions and stabilizes video (it's not there to prevent blurry pictures).
Wrong. OIS helps you a lot when taking pictures in low-light conditions with a long exposure and prevents blurry pictures, and that's why it's a really nice addition to have.
Galaxo60 said:
Wrong. OIS helps you a lot when taking pictures in low-light conditions with a long exposure and prevents blurry pictures, and that's why it's a really nice addition to have.
A longer exposure time means that the camera is able to capture more light... Guess what else captures a lot more light? The large 1.55-micron pixels that the Pixel phone has.
Taking pictures in the dark results in more noise, not blurred pictures per se.
With the f/2.0 aperture, pictures will have less of a shallow depth-of-field effect than with an f/1.7 or f/1.8 aperture. Is that correct?
scandalousk said:
A longer exposure time means that the camera is able to capture more light... Guess what else captures a lot more light? The large 1.55-micron pixels that the Pixel phone has.
Taking pictures in the dark results in more noise, not blurred pictures per se.
I agree with you on these points, but the Nexus 6P has the same camera and still produces some unexpected results from time to time, so if Google nailed it with this one, I think many people will be happy.
Here's a low-light test, and it seems the focus is still messed up:
https://www.youtube.com/watch?v=RbLZq52fVQM
Galaxo60 said:
I agree with you on these points, but the Nexus 6P has the same camera and still produces some unexpected results from time to time, so if Google nailed it with this one, I think many people will be happy.
Here's a low-light test, and it seems the focus is still messed up:
https://www.youtube.com/watch?v=RbLZq52fVQM
The Nexus 6P does not use the same camera as the Pixel phones; it's a different sensor, although both phones have 1.55-micron pixels. The Nexus 6P also doesn't use any hardware-based stabilization like the Pixel does.
And while the focus hunted in that video, it's a single instance where OIS wouldn't have made a difference, since the Nexus 6P did eventually focus.
The best thing to do is just wait and see. I'm sure Google will give us something stellar.
scandalousk said:
The Nexus 6P does not use the same camera as the Pixel phones; it's a different sensor, although both phones have 1.55-micron pixels. The Nexus 6P also doesn't use any hardware-based stabilization like the Pixel does.
And while the focus hunted in that video, it's a single instance where OIS wouldn't have made a difference, since the Nexus 6P did eventually focus.
The best thing to do is just wait and see. I'm sure Google will give us something stellar.
This looks pretty nice:
https://www.youtube.com/watch?v=1oftbNhz8fU
Has anyone had success running the Google Pixel camera on the P10 for HDR+, and what other camera apps do you use?
I use Cortex Camera for quality slow photography (multi-frame processing), like the S8 but with many more frames (the S8 takes 3 photos and combines them; Cortex uses 10-100 photos for best quality).
I also use A Better Camera, but it's still missing the Super Sensor option on the P10 for best output.
The stock camera uses too much sharpening for landscapes and distant objects, but it is fine for near objects...
I've not had any problems with the stock camera, and can't get the Google Pixel camera to install.
The Google Camera APK with HDR enabled.
It's not working with my Mate 9.
Is anyone else having problems with panorama photos? Mine are crap! The joins are rubbish and any horizon shots I take look like the Manhattan skyline!
Looks mostly fine to me when shooting horizontal panoramas. The only vertical panorama I've shot is blurry on one half, and I don't like manually switching between shooting vertical and horizontal panoramas. Overall I like Samsung's panorama user experience better, but I think I'll live.
The Huawei Leica camera also uses stacking when processing pictures.
ClausG76 said:
The Huawei Leica camera also uses stacking when processing pictures.
Yes, but it stacks a B&W and a color photo; the result has fine detail but some noise is also present.
The P10's RAW images have very high dynamic range, so it could stack them like the Pixel does.
https://youtu.be/Fuqz_iiodSM
Here is my test against the Galaxy S8. What do you think about the camera? Sorry if the music is a bit boring.
The video part was recorded on the Huawei P10 in 4K.
streetewok said:
Is anyone else having problems with panorama photos? Mine are crap! The joins are rubbish and any horizon shots I take look like the Manhattan skyline!
Yes they are... Huawei has to improve.
The panorama is really not good. I hope Huawei will fix it soon.
For now, I have a photo for you all that I took with my P10. Nothing edited; shot handheld in manual mode.
Hi,
RE: P10 camera talk.
I'm seeing much better results than the Pixel or iPhone 7.
Quite satisfied with the camera, but I hope they will port PanoSphere to their camera modes.
This one was shot in really harsh light; the phone did a really great job, however.
arcangelbelo said:
This one was shot in really harsh light; the phone did a really great job, however.
Did you edit it?
A bit of Snapseed; I used the HDR filter and added some detail.
Coguar said:
Has anyone had success running the Google Pixel camera on the P10 for HDR+, and what other camera apps do you use?
I use Cortex Camera for quality slow photography (multi-frame processing), like the S8 but with many more frames (the S8 takes 3 photos and combines them; Cortex uses 10-100 photos for best quality).
I also use A Better Camera, but it's still missing the Super Sensor option on the P10 for best output.
The stock camera uses too much sharpening for landscapes and distant objects, but it is fine for near objects...
I just casually ran into this thread (I own a Mi5s), since I have also looked for alternatives to Google's multi-frame HDR+.
On the Mi5s, Cortex works wonderfully on low-light still subjects. But my favorite app is Snap HDR Camera in multi-frame night mode (usually set to 4-8 frames) for shots that require ISO 400-2000; it's much faster than Cortex and better than A Better Camera on my phone. Incidentally, Almalence, A Better Camera's developer, has licensed its technology to Huawei in the past.
Is there a way to completely disable all the beautification and bokeh options for the front camera? I don't like the effect at all and disabling it every time can't be the only solution.
That's weird. On my phone, it's only on if I left it on the last time I used the camera.
I recently got my Nokia 6.1, and the first thing I did was download Arnova's Google Camera and Potter's Night Sight Google Camera.
Maybe it's just me, but I wasn't able to find RAW capture in the default camera app. I need help with this because I liked the stock camera's features.
As for Google Camera, with the Super Photo setting on both cameras I noticed a high amount of noise. Check the photo corners.
It happens in both Night Sight and HDR+ modes, so I'm sticking with the stock camera and using GCam HDR+ only for selfies. Disappointed.
In poor light conditions, DSLRs also have noise in the final image, so you can't expect miracles from a sensor the size of a pinhead compared to APS-C or full-frame sensors, which have far more area for gathering light. Combine that with the poorer high-ISO performance, the lack of long exposure times, and the lack of a variable aperture, and I'm really amazed at what the computational technology of the GCam ports is capable of doing in low light, handheld! Today I made some tests, and with night mode I got results that would require a tripod for my camera to produce an image of comparable quality.
The stock camera doesn't have RAW support, and RAW doesn't "eliminate" noise. On the contrary, in some situations GCam copes better than I do at editing the noise out while preserving detail (I do it in Lightroom).
So for now, GCam with night mode is quite a thing, given the limited hardware of every smartphone camera.
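If it helps to see why stacking frames works at all, here's a tiny Kotlin sketch of the core idea: random noise averages out across frames while the scene stays put, so N frames improve signal-to-noise by roughly the square root of N. Real implementations (HDR+, Cortex, etc.) also align frames and reject motion; this toy version just assumes perfectly aligned grayscale frames, and the function name is made up for the example.

```kotlin
// Average N aligned grayscale frames (0..255 per pixel) to suppress noise.
fun stackFrames(frames: List<IntArray>): IntArray {
    require(frames.isNotEmpty()) { "need at least one frame" }
    val size = frames.first().size
    val sums = LongArray(size)
    for (frame in frames) {
        require(frame.size == size) { "all frames must have the same size" }
        for (i in 0 until size) sums[i] += frame[i].toLong()
    }
    // Average and clamp back to the 8-bit range.
    return IntArray(size) { i -> (sums[i] / frames.size).toInt().coerceIn(0, 255) }
}
```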
bo6o said:
In poor light conditions, DSLRs also have noise in the final image, so you can't expect miracles from a sensor the size of a pinhead compared to APS-C or full-frame sensors, which have far more area for gathering light. Combine that with the poorer high-ISO performance, the lack of long exposure times, and the lack of a variable aperture, and I'm really amazed at what the computational technology of the GCam ports is capable of doing in low light, handheld! Today I made some tests, and with night mode I got results that would require a tripod for my camera to produce an image of comparable quality.
The stock camera doesn't have RAW support, and RAW doesn't "eliminate" noise. On the contrary, in some situations GCam copes better than I do at editing the noise out while preserving detail (I do it in Lightroom).
So for now, GCam with night mode is quite a thing, given the limited hardware of every smartphone camera.
Well, I am a photographer myself and I understand what you said, but in my case, even at ISO 100 with a 3-second shutter on a tripod, the final image has artifacts, not noise. I noticed that even in daylight; somehow after a phone restart I got less of it, so possibly it was a software bug.
So far I've tested Footej Camera and Momento Pro Camera; both allow saving in RAW, however they don't support shutter speeds longer than 1 second.
Camera FV-5 is also good; it saves RAW and works pretty well.
As for Google Camera, I use it for selfies, which come out a lot more detailed. For video I recommend Cinema 4K; you get decent quality with a 200 Mbps bitrate (sample).
The software stabilization in that app is quite decent too; make sure you shoot in 1080p for stabilization to work.
I tested with the Lightroom camera, and sadly I have to agree with you, but I don't think it's a software bug. It's either a limitation of the sensor (one reason for the short maximum exposure time) or simply fast overheating of the sensor (less plausible).
I've seen a comparison between the Pixel 3's night mode and an a7R III camera; it's amazing, and has nothing in common with the noise on the 6.1. And this makes me think of a third reason: as this app is a port, it might have a special algorithm for cleaning the noise of the Pixel's sensor which doesn't work for ours. This theory could be checked by asking someone to take a comparable long exposure with a Pixel 3 in DNG, using a third-party app.
So I noticed that the zoom lens only activates at 5x zoom. You can see this by covering the lens with your finger and zooming in.
I gather this must mean that the optical 4x lens is fixed at 4x, but as it only kicks in at 5x zoom, you can never get 4x optical zoom shots, as the image is cropped to 5x?
This would also mean that up to 5x zoom you are just getting a digital crop of the main lens?
Am I missing something here? It seems a little weird that you can't get a 4x optical shot with a 4x optical lens fitted.
rosso22 said:
So I noticed that the zoom lens only activates at 5x zoom. You can see this by covering the lens with your finger and zooming in.
I gather this must mean that the optical 4x lens is fixed at 4x, but as it only kicks in at 5x zoom, you can never get 4x optical zoom shots, as the image is cropped to 5x?
This would also mean that up to 5x zoom you are just getting a digital crop of the main lens?
Am I missing something here? It seems a little weird that you can't get a 4x optical shot with a 4x optical lens fitted.
Google's periscope lens doesn't kick in in a "fixed" manner; it uses an algorithm (ambient light, shake, distance to the subject, etc.) to decide on its own when to switch to the dedicated zoom lens. You can easily test/replicate this by lowering the light in your room and/or changing your distance to the subject, and you'll see that the point where the periscope lens kicks in changes.
The best approach is to watch the amount of detail you can see; if you play around with it, you can quickly tell through the preview when the periscope kicks in, since that will improve the amount of detail in your preview window significantly.
Many people here want "fixed" values, but as of now, Google doesn't offer that.
This is also a big problem in reviews, since those reviewers don't know about this limitation; you often see a "here, 4x shot, not looking good" while my eyes tell me "yeah, that's a digital crop, you m*ron", and none are the wiser.
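Just to illustrate the kind of decision logic being described, here's a toy Kotlin sketch. To be clear, this is purely my own guess, not Google's code, and every threshold in it is made up; it only shows how zoom ratio, light, distance and shake could be combined to pick the telephoto.

```kotlin
// Purely illustrative heuristic, NOT Google's actual switching logic.
data class SceneState(
    val zoomRatio: Float,       // user-requested zoom, e.g. 4.0f
    val luxEstimate: Float,     // ambient light estimate
    val focusDistanceM: Float,  // distance to subject in metres
    val shakeMagnitude: Float   // gyro-derived shake estimate
)

fun shouldUseTelephoto(s: SceneState): Boolean {
    val enoughZoom = s.zoomRatio >= 4.3f       // roughly where users report the switch
    val enoughLight = s.luxEstimate >= 50f     // the periscope sensor is smaller/slower
    val farEnough = s.focusDistanceM >= 1.2f   // telephoto can't focus very close
    val steadyEnough = s.shakeMagnitude <= 0.5f
    return enoughZoom && enoughLight && farEnough && steadyEnough
}
```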
So you're saying you never know whether you will get a crop or an optical zoom shot at 4x; that makes the 4x lens pointless, IMHO.
Google is basically stopping you from using a feature of the phone whenever you want. If every zoom shot is a gamble between cropped and optical, decided by the phone, surely they are stopping you from getting the best performance from the camera at every shot/zoom length.
If I choose a 4x shot, I want optical every time, or I may as well have bought the 6 instead of the Pro and saved some money.
The deciding factor on whether it's digital zoom or the telephoto lens is the distance to the subject. I estimate the telephoto lens's minimum focus distance is around 4 feet. As you move between close and far objects, you can see the stutter between the digital 4x and the telephoto 4x. You could always cover either the main lens or the telephoto and figure it out that way too. What they really need to do is add an icon in the camera app that lets you know whether it's 4x digital zoom or 4x telephoto.
This isn't something that's unique to the 6 Pro.
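For what it's worth, Camera2 already has a field for exactly this. If the HAL reports it (I haven't verified that the 6 Pro's does), a camera app could read the active physical camera ID from each capture result and drive that icon from it. A rough Kotlin sketch:

```kotlin
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CaptureRequest
import android.hardware.camera2.CaptureResult
import android.hardware.camera2.TotalCaptureResult
import android.util.Log

// Attach this callback to the capture session; it logs which physical sensor
// produced each frame of the logical back camera (API 29+, HAL-dependent).
val captureCallback = object : CameraCaptureSession.CaptureCallback() {
    override fun onCaptureCompleted(
        session: CameraCaptureSession,
        request: CaptureRequest,
        result: TotalCaptureResult
    ) {
        // Null if the HAL doesn't report the active physical camera.
        val activeId = result.get(CaptureResult.LOGICAL_MULTI_CAMERA_ACTIVE_PHYSICAL_ID)
        Log.d("LensCheck", "frame came from physical camera id=$activeId")
        // A camera app could surface this as the "digital vs telephoto" icon
        // suggested above.
    }
}
```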
EeZeEpEe said:
What they really need to do is add an icon in the camera app that lets you know whether it's 4x digital zoom or 4x telephoto.
This. I think that's the best solution for now: let people know which lens they're getting at the moment.
rosso22 said:
So you're saying you never know whether you will get a crop or an optical zoom shot at 4x; that makes the 4x lens pointless, IMHO.
Google is basically stopping you from using a feature of the phone whenever you want. If every zoom shot is a gamble between cropped and optical, decided by the phone, surely they are stopping you from getting the best performance from the camera at every shot/zoom length.
If I choose a 4x shot, I want optical every time, or I may as well have bought the 6 instead of the Pro and saved some money.
The threshold in the HAL is at about 4.3x, provided that all the other requirements for the switch are met. You can see the image "jump" slightly when it crosses the threshold, because the image characteristics don't match perfectly. So yes, you can definitely tell which sensor it is using.
The reason they don't alert you to which sensor is in use is that they (Google) think they're smarter than you and better able to pick the right sensor.
Another thing to keep in mind: if the 4x telephoto were the default, anything too close would be immediately blurry. Then how would you switch to digital zoom?
They could just give us a way of selecting the lens in use manually, instead of treating us all like idiots.
rosso22 said:
They could just give us a way of selecting the lens in use manually, instead of treating us all like idiots.
Good luck with that. Truly, given the way the entire world is being run right now, it's pretty clear that the vast majority of people *ARE* idiots. And that's putting it mildly. So the best way to make money is to sell to the majority, who are stupid enough to trust the government; those people clearly can't think for themselves.
rosso22 said:
So you're saying you never know whether you will get a crop or an optical zoom shot at 4x; that makes the 4x lens pointless, IMHO.
Google is basically stopping you from using a feature of the phone whenever you want. If every zoom shot is a gamble between cropped and optical, decided by the phone, surely they are stopping you from getting the best performance from the camera at every shot/zoom length.
If I choose a 4x shot, I want optical every time, or I may as well have bought the 6 instead of the Pro and saved some money.
Yes, return your Pro and get the 6. Be happier.
It's worse on video. I took a supposedly 4x video of my dog and it looks terrible: pixelated, oversharpened and unusable.
MacGuy2006 said:
It's worse on video. I took a supposedly 4x video of my dog and it looks terrible: pixelated, oversharpened and unusable.
Do you have 4K 60 fps enabled?
For some odd reason, Google only allows the dedicated sensors to kick in if you use 30 fps, since neither the ultrawide nor the telephoto module supports 4K 60 fps.
So if you zoom in while 4K 60 fps is enabled, it will always be a digital zoom.
This is one of the most annoying features. The 4x lens can focus at reasonably close range, but the "AI" deciding whether to crop the main lens or switch to the periscope is crap.
You can trick it by switching to 4x, focusing on a distant object and back again, and hoping your subject is just within range.
People ask why use it so close, but the 4x can really help in composing a shot without the wide-angle distortion.
Another issue: if you adjust anything like HDR or temperature on the main lens and then switch focal length to another option, it doesn't switch to the actual lens you want anymore!!
MacGuy2006 said:
It's worse on video. I took a supposedly 4x video of my dog and it looks terrible: pixelated, oversharpened and unusable.
You can't use the telephoto sensor for video. Only the other two.
86rickard said:
This is one of the most annoying features. The 4x lens can focus at reasonably close range, but the "AI" deciding whether to crop the main lens or switch to the periscope is crap.
In my book, that's an idiotic "feature" and is in fact a bug.
It makes zooming in day-to-day use unusable in most common cases.
When I zoomed to 4x and took a video, the result (1080p) was not usable. It's a joke.
I think the conclusion here is that the entire camera system really needs work. The hardware is better, but the software is letting it down: jerky transitions, bugs, and a processing algorithm left over from the lower-quality sensor days that actually overworks the 50 MP images.
96carboard said:
You can't use the telephoto sensor for video. Only the other two.
No, you can. I just tested it by setting it to 4x, making sure I was focused on something far enough away, and blocking the telephoto lens did block the picture.
Just discussing this elsewhere, with regard to the MWP GCam mod. This is a *serious* improvement over the stock app; the one thing they could do to make it perfect is give you control over when the periscope kicks in. If anyone here knows the people working on this, please mention this. The mod is here, btw — you really should try it! The version you want is 8.3.252-V1c_MWP:
MWP GCam APKs - Google Camera Port
Modified Google Camera app by MWP.
www.celsoazevedo.com
Gnaius said:
Just discussing this elsewhere, with regard to the MWP GCam mod. This is a *serious* improvement over the stock app; the one thing they could do to make it perfect is give you control over when the periscope kicks in. If anyone here knows the people working on this, please mention this. The mod is here, btw — you really should try it! The version you want is 8.3.252-V1c_MWP:
MWP GCam APKs - Google Camera Port
Modified Google Camera app by MWP.
www.celsoazevedo.com
That may not actually be an option. The HAL only presents one logical camera on that side of the phone and transparently switches between the sensors based on the level of zoom requested and an assortment of other data.
I haven't looked into what the HAL looks like. Is it open source? If it is, presumably it could be modified to present 3 or 4 logical cameras (i.e. 3 individual + 1 combined).
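At the app level, Camera2 does at least let you enumerate a logical camera's physical sub-cameras and, where the HAL permits it, pin an output stream to one of them; whether the 6 Pro's HAL actually exposes the telephoto that way is the open question. A hedged Kotlin sketch of what the API allows:

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.params.OutputConfiguration
import android.util.Log
import android.view.Surface

// List the physical sub-cameras behind each logical camera the HAL exposes.
fun listPhysicalCameras(manager: CameraManager) {
    for (id in manager.cameraIdList) {
        val chars = manager.getCameraCharacteristics(id)
        val caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
        val isLogical = caps?.contains(
            CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_LOGICAL_MULTI_CAMERA
        ) == true
        if (isLogical) {
            // physicalCameraIds is empty if the HAL hides the sub-cameras.
            Log.d("Lenses", "logical camera $id -> physical ${chars.physicalCameraIds}")
        }
    }
}

// If a physical ID is exposed, an output stream can be pinned to it (API 28+).
fun outputForPhysicalCamera(surface: Surface, physicalId: String): OutputConfiguration =
    OutputConfiguration(surface).apply { setPhysicalCameraId(physicalId) }
```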
Gnaius said:
Just discussing this elsewhere, with regard to the MWP GCam mod. This is a *serious* improvement over the stock app;
How is it an improvement?
Testing zoom, it still uses the main lens at 1x, 2x and 4x, just like the stock camera does.
This is another reason to dislike this phone and to recommend against buying it.
Hi all. I am debating the switch from the regular 6 to the Pro.
Looks like I would be getting a bigger, higher-refresh-rate screen (pro), which is curved (con), and with worse battery life.
One thing that is not clear to me: is the zoom lens used in any situations other than when actively selecting it? For example, which of the lenses is used when you select portrait mode in the camera app? Is it using the telephoto lens to combat the fisheye effect?
monocay said:
Hi all. I am debating the switch from the regular 6 to the Pro.
Looks like I would be getting a bigger, higher-refresh-rate screen (pro), which is curved (con), and with worse battery life.
One thing that is not clear to me: is the zoom lens used in any situations other than when actively selecting it? For example, which of the lenses is used when you select portrait mode in the camera app? Is it using the telephoto lens to combat the fisheye effect?
I have never gotten the telephoto lens to show up in GCam, only the 50 MP wide camera and the 12 MP ultrawide camera. The 50 gives you zoom options, but the 48 MP telephoto never shows up.
monocay said:
Hi all. I am debating the switch from the regular 6 to the Pro.
Looks like I would be getting a bigger, higher-refresh-rate screen (pro), which is curved (con), and with worse battery life.
One thing that is not clear to me: is the zoom lens used in any situations other than when actively selecting it? For example, which of the lenses is used when you select portrait mode in the camera app? Is it using the telephoto lens to combat the fisheye effect?
It's a 4x telephoto, which is a lot for portraits (so portraits are taken using the main camera at 2x magnification).
No, it's not being used to enhance the ultrawide either. I suppose that's for the same reason mentioned above (OnePlus uses its 2x tele for this as well).
The telephoto is really only useful for zooming from 4x all the way up to 20x, where photos are still kind of usable (depending on the scenario), thanks to it being a 48 MP sensor and some of Google's AI magic.
I take a lot of zoomed pictures and I don't mind the curved display. Battery is okay; 6 hours of screen-on time is realistic for me, mostly social networks and news scrolling. Sometimes I play League of Legends, and then it drops to 4-5 hours.
The zoom lens is for... zooming in. I would think that would be self-explanatory?
It kicks in when the magnification reaches about 4.5x. Switching between sensors is exclusively a matter of changing the magnification level, not of "selecting" a particular camera. There are other details considered before switching sensors, such as exposure requirements.
You can tell which sensor is in use by briefly blocking the sensors with your finger. When you see your finger on the screen, you've found the sensor that is in use.
Yeah, thanks.
What was not clear to me was whether it is used when shooting portrait photos, as some phones (e.g. OnePlus) do.
Looks like this is not the case.
I will stick with my vanilla 6 then; not missing out on much, subjectively.
monocay said:
Yeah, thanks.
What was not clear to me was whether it is used when shooting portrait photos, as some phones (e.g. OnePlus) do.
Looks like this is not the case.
I will stick with my vanilla 6 then; not missing out on much, subjectively.
Portrait photos can be taken with ANY lens, depending on the distance and direction to the subject(s).
96carboard said:
Portrait photos can be taken with ANY lens, depending on the distance and direction to the subject(s).
Is it so? How does the portrait UI look on the Pixel 6 Pro? On the vanilla Pixel I have the option to shoot at 1x and 2x only. Does the Pro version offer a higher magnification, using the zoom lens instead of the main one?
monocay said:
Is it so? How does the portrait UI look on the Pixel 6 Pro? On the vanilla Pixel I have the option to shoot at 1x and 2x only. Does the Pro version offer a higher magnification, using the zoom lens instead of the main one?
No. 1x and 2x are the only options available. The telephoto lens isn't used in portrait mode.