Esato


Sony Xperia Rumors 2015
amirprog
W800
Joined: Aug 22, 2013
Posts: > 500
From: Israel
Posted: 2015-01-07 17:52
A new juicy Z4 leak (shown behind closed doors at CES) from ePrice, a pretty reliable site as far as I remember (Google Translate): http://www.eprice.com.tw/mobile/talk/4551/4924586/1/
PhoneArena translation: http://www.gforgames.com/gadg[....]a-z4-two-flavors-2k-fhd-46041/
Waterproof USB port (certainly IP68), glossy metal frame ("The Elemental" may be the real thing), some markets get 2K while others get 1080p (which I don't understand), a blue colour version, and the slim, lightweight chassis maintained without sacrificing battery. Positive news overall.
[ This Message was edited by: amirprog on 2015-01-07 16:55 ]
ascariss
Sony Xperia Z3
Joined: Apr 06, 2013
Posts: > 500
Posted: 2015-01-07 18:13

On 2015-01-07 08:12:51, Xajel wrote:
@HxH

My points were based on previous Tegra SoCs; it might be different this time, as NVIDIA did the Maxwell cores very well. But is that enough for a mobile SoC? The benchmarks only show performance, and all the previous Tegra SoCs showed amazing results at first too... but later, when real-world battery life figures arrived, there were disappointments...

I was misled by the 8x A53 core config, as I saw it on two websites and thought it was the real thing... but as you said, it's actually a big.LITTLE configuration with 4x A57 + 4x A53, which is nice...

The problem, bro, is that 10W is still very high for a mobile phone SoC, and the 1 TFLOPS comes mainly from the GPU cores, though the general ARM cores also contribute a share of that 1 TFLOPS.

According to notebookcheck.com, the Snapdragon 801 uses 3-4W depending on configuration, while the Snapdragon 805 uses 3-5W. So the X1 is more than double the 801 and double the 805 at its maximum power.

And the last thing, which you already mentioned: the lack of a modem. This means increased cost, not to mention the extra power needed to run both the SoC and a separate modem, while the 801 and 805 already include a modem and their power figures account for it...



Here is a comparison from AnandTech, again running Manhattan 1080p (offscreen) with the X1’s GPU underclocked to match the performance of the A8X at roughly 33fps.

http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/3

NVIDIA’s tools show the X1’s GPU averages 1.51W over the run of Manhattan. Meanwhile the A8X’s GPU averages 2.67W, over a watt more for otherwise equal performance. This test is especially notable since both SoCs are manufactured on the same TSMC 20nm SoC process, which means that any power differences between the two devices are solely a function of energy efficiency.


I'm not saying this is a super scientific test, but I feel the 20nm process and the ARM cores help with power drain. I think the target for this chip will be tablets rather than phones, of course.
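For what it's worth, here is a quick back-of-envelope perf-per-watt calculation from those AnandTech numbers (just a sketch using the ~33fps and the 1.51W / 2.67W GPU averages quoted above, nothing rigorous):

```python
# Rough perf-per-watt estimate from the AnandTech Manhattan 1080p (offscreen)
# run quoted above. Both GPUs were matched at roughly 33 fps.
fps = 33.0                   # approximately equal performance on both chips
tegra_x1_gpu_w = 1.51        # reported average GPU power for the Tegra X1
apple_a8x_gpu_w = 2.67       # reported average GPU power for the A8X

x1_eff = fps / tegra_x1_gpu_w       # ~21.9 fps per watt
a8x_eff = fps / apple_a8x_gpu_w     # ~12.4 fps per watt

print(f"Tegra X1: {x1_eff:.1f} fps/W, A8X: {a8x_eff:.1f} fps/W")
print(f"X1 advantage in this single test: ~{x1_eff / a8x_eff:.2f}x")
```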

@amirprog, I knew the Z4 would be backstage at CES; hopefully something leaks. I am fine with no 2K screen on the phone, which will probably be the case for Europe, at least here in Poland for sure; 1080p is plenty on a small screen. I am happy the USB port will be open, no more docks, though I do like using my dock, haha. I hope the camera will have phase detection, otherwise I doubt they can improve the autofocus. I honestly wish Sony would add exposure control, the ability to choose ISO like a real camera, and real manual control; they can do it, and if the Cyber-shot rumours are true, they might as well add a decent camera with manual controls (minus aperture, which would be very hard on a tiny module).
JohnnyNr.5
Model not set
Joined: Jun 19, 2012
Posts: 357
Posted: 2015-01-07 18:49
If the 1080p version of the Z4 has the same specs as the QHD version but with a cheaper price point and better battery life, then I'm totally fine.
[ This Message was edited by: JohnnyNr.5 on 2015-01-07 17:50 ]
MNX1024
Model not set
Joined: Jul 08, 2009
Posts: 413
Posted: 2015-01-07 18:52
I'm really hoping for Type-C, but not sure what to expect due to the G Flex 2 not using it.
amirprog
W800
Joined: Aug 22, 2013
Posts: > 500
From: Israel
Posted: 2015-01-07 19:19
@ascariss
Well, I don't get the point of having 2K for us and possibly other regions and 1080p for other markets; either give everyone 1080p or everyone 2K. I'm fine with 1080p at 5.2", but I feel the reason for this is business between the panel manufacturers and Sony rather than Sony's concern for battery life.
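Just to put rough numbers on the 2K vs 1080p point, here's a quick sketch (assuming a hypothetical 5.2" 16:9 panel; figures are approximate):

```python
# Pixel-density comparison for a hypothetical 5.2-inch 16:9 panel,
# QHD (2560x1440) vs 1080p (1920x1080). Illustrative numbers only.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h) in {"2K (QHD)": (2560, 1440), "1080p": (1920, 1080)}.items():
    print(f"{name}: {ppi(w, h, 5.2):.0f} PPI, {w * h / 1e6:.2f} MP to drive")

# QHD works out to roughly 565 PPI vs ~424 PPI for 1080p, but the GPU has to
# push about 1.78x as many pixels, which is the usual battery-life argument.
```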
Can you explain, from your knowledge, why Sony has yet to add manual focus, and why not all camera features are available in the higher-resolution modes? And let's hope Auto mode gets better.
CrownedAkuma
Sony Xperia Z2
Joined: Jun 28, 2013
Posts: > 500
From: North-East Italy
Posted: 2015-01-07 19:30

On 2015-01-07 18:49:59, JohnnyNr.5 wrote:
If the 1080p version of the Z4 has the same specs as the QHD version but with a cheaper price point and better battery life, then I'm totally fine.
[ This Message was edited by: JohnnyNr.5 on 2015-01-07 17:50 ]


Couldn't agree more on this
nodarsixar
Sony Xperia Z
Joined: May 22, 2013
Posts: > 500
From: AD
Posted: 2015-01-07 19:37

On 2014-12-14 12:37:40, nodarsixar wrote:
we have Z4 spec!
5.2 inch
Snapdragon 810
3GB RAM
3300mAh Battery
21MP Camera
New UI
Best AUDIO
IP68
140 grams
LTE Cat 6


huiyi

AND



Z4 Compact- NO
Z4 Ultra - NO
Z5 Compact- YES



4GB RAM?
NO
Gitaroo
Model not set
Joined: Aug 17, 2013
Posts: > 500
Posted: 2015-01-07 21:00
16 GB internal storage needs to die.
Gitaroo
Model not set
Joined: Aug 17, 2013
Posts: > 500
Posted: 2015-01-07 21:05
So the sensor is confirmed to be a new one if it is 21 MP? The old one wasn't very good at 20 MP; I can't imagine what it's going to be like if they stretch it to 21 MP.
CrownedAkuma
Sony Xperia Z2
Joined: Jun 28, 2013
Posts: > 500
From: North-East Italy
Posted: 2015-01-07 22:50

On 2015-01-07 21:05:10, Gitaroo wrote:
So the sensor is confirmed to be a new one if it is 21 MP? The old one wasn't very good at 20 MP; I can't imagine what it's going to be like if they stretch it to 21 MP.

Also, if I recall correctly, the old one was a 1/2.3" sensor with 20.7 MP, whereas this is a 1/2.4" with 21 MP... so it should be even worse?!?
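Rough pixel-pitch math on that, for what it's worth (the sensor dimensions below are the commonly quoted approximations for those optical formats, not official figures, so treat this as a sketch):

```python
# Back-of-envelope pixel pitch comparison for the two rumoured sensor specs.
# Width/height in mm are approximate values for each optical format.
import math

sensors = {
    '1/2.3" 20.7 MP (old)':    (6.17, 4.55, 20.7e6),
    '1/2.4" 21 MP (rumoured)': (5.90, 4.43, 21.0e6),
}

for name, (w_mm, h_mm, pixels) in sensors.items():
    area_mm2 = w_mm * h_mm
    pitch_um = math.sqrt(area_mm2 / pixels) * 1000   # approx. pixel pitch in micrometres
    print(f"{name}: ~{area_mm2:.1f} mm^2, ~{pitch_um:.2f} um pixel pitch")

# Slightly smaller area with slightly more pixels means marginally smaller
# pixels, so per-pixel light gathering would be a touch worse, all else equal.
```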
JohnnyNr.5
Model not set
Joined: Jun 19, 2012
Posts: 357
Posted: 2015-01-07 23:00
I really hope Sony won't release an S810-equipped Z3 without flaps as the "new" Z4. 16GB ROM would be just....
cu015170
Nokia 808 PureView
Joined: Oct 26, 2010
Posts: > 500
Posted: 2015-01-08 01:03
You guys are so harsh... but I guess that's what real fans have to be.
randomuser
Apple iPhone 5S
Joined: Sep 13, 2011
Posts: > 500
Posted: 2015-01-08 05:58

On 2015-01-07 21:05:10, Gitaroo wrote:
So the sensor is confirmed to be a new one if it is 21 MP? The old one wasn't very good at 20 MP; I can't imagine what it's going to be like if they stretch it to 21 MP.


It's the same 20.7MP sensor as in the Z1, Z2, and Z3.
Xajel
Sony Xperia S
Joined: Jun 18, 2004
Posts: > 500
From: Bahrain
Posted: 2015-01-08 07:11

On 2015-01-07 18:13:14, ascariss wrote:

Here is a comparison from AnandTech, again running Manhattan 1080p (offscreen) with the X1’s GPU underclocked to match the performance of the A8X at roughly 33fps.

http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/3

NVIDIA’s tools show the X1’s GPU averages 1.51W over the run of Manhattan. Meanwhile the A8X’s GPU averages 2.67W, over a watt more for otherwise equal performance. This test is especially notable since both SoCs are manufactured on the same TSMC 20nm SoC process, which means that any power differences between the two devices are solely a function of energy efficiency.


I'm not saying this is a super scientific test, but I feel the 20nm process and the ARM cores help with power drain. I think the target for this chip will be tablets rather than phones, of course.


Video playback, even at 4K, doesn't use all the resources, especially when you have a hardware decoder, which keeps most of the CPU and GPU usage low (not 0%, as they still need to do something)...
We're talking here about maximum power usage in the worst real-world scenario, something like heavy gaming...

I'm not saying Tegra is bad; it's very good. But it's not 100% designed to be a mobile chip, and that's why NVIDIA uses more power than usual just to compete with the other makers.
Maxwell is good, in fact very good, they did a hell of a job with it, but let's face it, it's a desktop + laptop part in the end... NVIDIA optimised it for mobile, removed some things, and used a low-power manufacturing process to reduce power consumption... but it can't be compared to a GPU designed for mobile from the ground up, like Adreno or PowerVR, for example...

Some history:
Adreno actually comes from ATi, which was later acquired by AMD, Adreno included. After a while AMD sold off all the mobile stuff, as the stupid CEO didn't believe in mobile (which exploded less than two years later)... Qualcomm bought the mobile assets from AMD, including Adreno.
ATi designed it as a graphics processor for mobile and special applications like TVs, arcade gaming machines and so on; it was used in mobiles too, but not that much, as mobile wasn't such a hit at the time.
At the same time they had their other GPU line for computers (desktop & laptop), which was a completely different design; they were separate projects targeting completely different segments. Of course they shared some technologies, being graphics processors in the end, and Qualcomm has continued advancing it...

Mobile-designed GPUs and desktop/laptop-class GPUs still share technologies: for example, mobile moved to unified shaders after desktop-class GPUs made the move first, mobile GPUs started getting DirectX support too, and tessellation is another thing that started in discrete GPUs and then found its way to mobile.

If you know some PC history: in the days of the Pentium III and then the Pentium 4, AMD introduced the Athlon and then the Athlon XP... that processor changed the balance, as the Athlon was faster than Intel's chips for the first time in years. Intel tried to catch up with the Pentium 4, but they did it wrong: they increased the pipeline depth, which can bring a faster clock but also increases power usage. They tried and never quite succeeded, then pushed the NetBurst architecture even further in the later Pentium 4 designs, deepening the pipelines again because they were aiming for 4 and 5GHz speeds (they even had 6GHz clocks on their roadmap), but the chip was too hot to even reach 4GHz and consumed too much power, and AMD was doing nicely with the Athlon 64 and Athlon 64 X2 by then.

At the same time, another team at Intel was very happy, because they had designed a new architecture for laptops; it was very good, with high performance and low power usage compared to any Pentium III or Pentium 4. That was the first Intel Core CPU. Intel then decided to ditch the Pentium design completely and use the Core design even for desktops, and so Core 2 was born. Only then did Intel flip the formula: the Core 2 had a much shorter pipeline but much higher IPC, so it ran at a lower clock yet gave more performance, and the lower clock also meant much lower power consumption than the Pentium 4...
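To put that pipeline/IPC trade-off in rough numbers, here's a toy model (purely illustrative figures, not real Pentium 4 or Core 2 specs): performance scales with IPC times clock, dynamic power scales roughly with C·V²·f, and deep-pipeline designs typically need higher voltage to hit their clock targets.

```python
# Toy model of the deep-pipeline vs high-IPC trade-off described above.
# All numbers are made up for illustration, not measured CPU figures.

def performance(ipc, clock_ghz):
    # Instruction throughput ~ IPC * clock
    return ipc * clock_ghz

def dynamic_power(cap, voltage, clock_ghz):
    # Classic CMOS approximation: P ~ C * V^2 * f
    return cap * voltage ** 2 * clock_ghz

# Hypothetical deep-pipeline design: high clock, low IPC, needs more voltage.
deep = {"ipc": 1.0, "clock_ghz": 3.8, "voltage": 1.4, "cap": 1.0}
# Hypothetical short-pipeline design: lower clock, higher IPC, lower voltage.
wide = {"ipc": 2.0, "clock_ghz": 2.4, "voltage": 1.2, "cap": 1.0}

for name, c in [("deep pipeline", deep), ("short pipeline / high IPC", wide)]:
    perf = performance(c["ipc"], c["clock_ghz"])
    power = dynamic_power(c["cap"], c["voltage"], c["clock_ghz"])
    print(f"{name}: perf {perf:.1f}, power {power:.2f}, perf/W {perf / power:.2f}")
```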

Only then did Intel take back the crown of best CPU maker from AMD, and AMD is still struggling, as they never thought much about power usage; their desktop parts still favour higher clocks to get performance, and even the designs themselves weren't good enough to compete. The latest desktop offerings are still not that good compared to Intel's, but there's still hope, and this is where the mobile parts come into play.

The main guy who designed the Athlon and the original Athlon 64 was smart, and still is... that's why those parts were very good. But he left AMD and went to work at Apple, and you can guess: he's the guy who led the team that made Apple's A7 & A8 processors... he knows what he's doing...
A few months ago AMD re-hired him, and he's now working on multiple projects, including a high-performance ARM design and AMD's next x86 architecture, which they call Zen. It should at least compete with Intel, which by then should have Skylake or even its successor, since Zen won't see the light of day until 2016...
HxH
Sony Xperia S
Joined: Dec 26, 2008
Posts: > 500
From: GMT+7
Posted: 2015-01-08 15:32
I guess back in 2006, when AMD acquired ATi, no one could have predicted how influential the mobile market would become. That's why we see the PC market shrinking so dramatically today. In 2008 there was the economic disaster, so AMD needed to cut units that couldn't make money, and that was also just a year after the iPhone 2G was introduced; mobile devices really started to catch fire around 2010 and have kept going ever since.

AMD got the master Jim Keller back from Apple; it will be interesting to see if he can work a miracle once again, but today AMD has quite limited resources (i.e. money and foundries), and unfortunately the economic climate around the world isn't doing so well either.
SONY x Sexy x Sleek x Solid x Stunner x Stylish : Nozomi/Kagura with Love.