Community Council | Posts: 4,920 | Thanked: 12,867 times | Joined on May 2012 @ Southerrn Finland
#591
Originally Posted by Venemo View Post
That's what the Neo900 project is doing, for the reasons you mention.

The disadvantage of such an approach however is that the old hardware has very low performance by today's standard so you can't really have a very responsive GUI or anything. So if that chip were used, the device would have no market appeal to the average consumers.
I really doubt that; for example, the N9's performance is pretty good IMHO.
The real determining factor here is that only gamers need high-performance hardware.

That's about all there is to it, same as with PC video cards; it's the gamers who need the performance.

I couldn't give a toss about games. I don't play games. At all.
 

The Following 9 Users Say Thank You to juiceme For This Useful Post:
Posts: 285 | Thanked: 1,900 times | Joined on Feb 2010
#592
Originally Posted by juiceme View Post
Why not, if said SoC or some OEM's drop-in replacement were still available?
Assuming it was available, the price should also be pretty low, and since the chip architecture would have been well tested and familiar, further silicon revisions would surely be more optimized and less power-hungry.
Also there'd be plenty of time to hone and fix open source drivers.

Actually, now that I come to think about it, there are nothing but good reasons to use an old SoC.
I wouldn't take a lower price for an older SoC for granted. They are usually made on an older manufacturing process, which may be more expensive, and they don't have the economies of scale, as demand for them is generally lower. This of course applies when one is not trying to tap into some leftover stock. A newer manufacturing process usually means improved power efficiency as well as improved processing power, so going with a reasonably fresh mid-range SoC and architecture instead of already-abandoned tech has its benefits IMO. Also, designing a PCB with all the other components for an old SoC may take more effort and add costs. OMAP was a decent architecture, but AFAIK it didn't have much demand outside Nokia, and when Nokia went with Qualcomm, TI didn't have much incentive to develop it further. Maybe if Nokia had bought it and made it "their own" like Apple did with their own SoC, there would have been a differentiator... but that would have required MeeGo to succeed.

So, IMO there are not many good reasons to use a (really) old SoC if one wants to create a competitive device.
 

The Following 8 Users Say Thank You to JulmaHerra For This Useful Post:
Community Council | Posts: 4,920 | Thanked: 12,867 times | Joined on May 2012 @ Southerrn Finland
#593
Originally Posted by JulmaHerra View Post
I wouldn't take a lower price for an older SoC for granted. They are usually made on an older manufacturing process, which may be more expensive, and they don't have the economies of scale, as demand for them is generally lower. This of course applies when one is not trying to tap into some leftover stock. A newer manufacturing process usually means improved power efficiency as well as improved processing power, so going with a reasonably fresh mid-range SoC and architecture instead of already-abandoned tech has its benefits IMO. Also, designing a PCB with all the other components for an old SoC may take more effort and add costs. OMAP was a decent architecture, but AFAIK it didn't have much demand outside Nokia, and when Nokia went with Qualcomm, TI didn't have much incentive to develop it further. Maybe if Nokia had bought it and made it "their own" like Apple did with their own SoC, there would have been a differentiator... but that would have required MeeGo to succeed.

So, IMO there are not many good reasons to use a (really) old SoC if one wants to create a competitive device.
I don't fully buy into that explanation.

As it happens, device battery runtimes have remained about the same, and in the long view have gone dramatically down, even as battery capacity and technology have improved all the time.

I hold the view that going for ever-faster CPUs and new architectures is the culprit to blame.

On the other hand, an existing SoC that was evolved in manufacturing technology, without trying to squeeze more performance out of it, would certainly become more power-efficient over the generations.
And having the same drivers, which could be optimized properly instead of as quick-hack let's-just-make-Android-compatible-drivers-now jobs, would help get more out of the HW.
 

The Following 4 Users Say Thank You to juiceme For This Useful Post:
Posts: 285 | Thanked: 1,900 times | Joined on Feb 2010
#594
Originally Posted by juiceme View Post
As it happens, device battery runtimes have remained about the same, and in the long view have gone dramatically down, even as battery capacity and technology have improved all the time.
True; however, software has also evolved quite a lot during that time, both in features and in security, which makes it heavier to run. Some of that could be improved by fine-tuning the software, but in the current world it's usually easier and cheaper to get acceptable results by throwing more iron at the problem. It's the natural result of quicker software release cycles.

Also, e.g. iPhones have a relatively small battery capacity (as they are intended to be smaller and thinner devices while packing some serious SoC performance), yet they have decent battery life, comparable to devices with significantly bigger batteries.

I hold the view that going for ever-faster CPUs and new architectures is the culprit to blame.
That's only one part; the other part is (as I mentioned previously) shorter release cycles to bring new features to the market quicker.

On the other hand, an existing SoC that was evolved in manufacturing technology, without trying to squeeze more performance out of it, would certainly become more power-efficient over the generations.
This depends on what you are going to do with it. It's not reasonable to throw the latest SoC at just writing text messages, but running an underpowered SoC on other workloads usually means higher power consumption, as high loads last for longer periods of time.
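That point is the "race to idle" argument, and a toy energy model makes it concrete. This is just a sketch with made-up power figures (the 2 W / 0.8 W / 0.05 W numbers are hypothetical, not measurements of any real SoC): a slower chip drawing less power can still burn more total energy if it takes much longer to finish and never reaches its idle state.

```python
def task_energy_j(active_power_w, seconds_active, idle_power_w, seconds_idle):
    """Total energy in joules for one job: active phase plus idle remainder."""
    return active_power_w * seconds_active + idle_power_w * seconds_idle

# Hypothetical chips finishing the same job within a 10-second window:
# fast SoC: bursts at 2 W for 2 s, then idles at 0.05 W for the remaining 8 s
# slow SoC: runs at 0.8 W for the full 10 s and never reaches idle
fast = task_energy_j(2.0, 2, 0.05, 8)   # 4.4 J
slow = task_energy_j(0.8, 10, 0.05, 0)  # 8.0 J
print(fast, slow)
```

Despite drawing less than half the peak power, the slow chip ends up using nearly twice the energy for the same work in this (invented) scenario.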

And having the same drivers, which could be optimized properly instead of as quick-hack let's-just-make-Android-compatible-drivers-now jobs, would help get more out of the HW.
True, but in a situation where SoC manufacturers more or less hate native Linux, it's not viable in the long run. AFAIK there are not many relevant SoC manufacturers that offer native Linux drivers, or even the documentation needed to create open source drivers for them. In that sense it's unfortunate that ST-Ericsson fell apart; that SoC would have had native drivers supported by the manufacturer.
 

The Following 6 Users Say Thank You to JulmaHerra For This Useful Post:
Posts: 3 | Thanked: 15 times | Joined on Aug 2017
#595
Originally Posted by juiceme View Post
I don't fully buy into that explanation.

As it happens, device battery runtimes have remained about the same, and in the long view have gone dramatically down, even as battery capacity and technology have improved all the time.

I hold the view that going for ever-faster CPUs and new architectures is the culprit to blame.

On the other hand, an existing SoC that was evolved in manufacturing technology, without trying to squeeze more performance out of it, would certainly become more power-efficient over the generations.
And having the same drivers, which could be optimized properly instead of as quick-hack let's-just-make-Android-compatible-drivers-now jobs, would help get more out of the HW.
Hmm, no. Recent CPUs are far more battery-efficient than old ones.

And there are recent CPUs specifically built to offer a good mix of performance and low energy consumption, like the Snapdragon 625, which is the choice Chen made for the Livermorium.

It's mainly the screens that have a big impact on battery, since they keep getting bigger and with ever-higher resolutions.

And no, you don't need to be a gamer to need more power than a CPU from 2009 can offer.

Web browsing needs far more horsepower now than in 2009, cameras need power too for picture processing, and so on.

Don't forget also that the 4G modem is now integrated directly into the SoC, and no longer a separate chip.

Sent from my LG-V500 using Tapatalk
 

The Following 6 Users Say Thank You to Trouveur For This Useful Post:
Venemo's Avatar
Posts: 1,296 | Thanked: 1,773 times | Joined on Aug 2009 @ Budapest, Hungary
#596
Originally Posted by juiceme View Post
I really doubt that; for example, the N9's performance is pretty good IMHO.
The real determining factor here is that only gamers need high-performance hardware.
Well, the N9 has kind of sort of okay-ish performance on its low-res screen, as long as you don't open too many websites and don't mind laggy scrolling.

Don't misunderstand me: the N9 was a great device in its time and I loved it. But times have moved on, and it's not state of the art anymore.
 

The Following 6 Users Say Thank You to Venemo For This Useful Post:
Community Council | Posts: 4,920 | Thanked: 12,867 times | Joined on May 2012 @ Southerrn Finland
#597
And now, seriously, hands up anyone who really needs those ever-higher-resolution screens?
And please explain why, since the pixels are already too small to see without a microscope??
I hold that even half-HD resolution is almost too much for a 5" device...
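For what it's worth, the pixel-density side of this is easy to sanity-check. A quick Python sketch, assuming "half-HD" means qHD (960x540), compared against Full HD on the same 5" diagonal:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a screen of the given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

print(round(ppi(960, 540, 5.0)))    # qHD on 5":    ~220 PPI
print(round(ppi(1920, 1080, 5.0)))  # Full HD on 5": ~441 PPI
```

At normal phone viewing distances, both are well past the point where individual pixels are easy to pick out, which is exactly the argument being made here.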

And cameras! People have gone straight-off-the-edge megapixel-crazy. And it's the megapixels that consume power/memory/cycles.

Instead I'd opt for fewer pixels and better optics...
I am willing to bet anyone, right now, that I can capture a more striking image of any given subject/situation with my 15-year-old DSLR, with only 3 megapixels, than anyone daring to contest me with the latest mobile devices!
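The memory/bandwidth part of the megapixel complaint is simple arithmetic: every uncompressed frame the image pipeline touches grows linearly with the pixel count. A rough sketch, assuming plain 24-bit RGB (3 bytes per pixel; real camera pipelines use other formats, so these are illustrative figures only):

```python
def raw_frame_bytes(megapixels, bytes_per_pixel=3):
    """Approximate size of one uncompressed frame (24-bit RGB by default)."""
    return int(megapixels * 1_000_000 * bytes_per_pixel)

for mp in (3, 12, 48):
    print(f"{mp:2d} MP -> {raw_frame_bytes(mp) / 1e6:.0f} MB per raw frame")
```

So a 48 MP sensor pushes roughly sixteen times the data of a 3 MP one through memory for every frame processed, before any compression.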
 

The Following 6 Users Say Thank You to juiceme For This Useful Post:
Venemo's Avatar
Posts: 1,296 | Thanked: 1,773 times | Joined on Aug 2009 @ Budapest, Hungary
#598
Originally Posted by juiceme View Post
And now, seriously, hands up anyone who really needs those ever-higher-resolution screens?
And please explain why, since the pixels are already too small to see without a microscope??
I hold that even half-HD resolution is almost too much for a 5" device...
I'd say at least HD (but not necessarily Full HD) is nice.

Originally Posted by juiceme View Post
And cameras! People have gone straight-off-the-edge megapixel-crazy. And it's the megapixels that consume power/memory/cycles.

Instead I'd opt for fewer pixels and better optics...
I am willing to bet anyone, right now, that I can capture a more striking image of any given subject/situation with my 15-year-old DSLR, with only 3 megapixels, than anyone daring to contest me with the latest mobile devices!
Completely agree on this.
My best camera phone so far has been the Nokia N95 (the N900 and N950 don't even come close; the N9 was mostly okay).

However...

"Average" consumers won't be interested in this device if it doesn't participate in the "specs war".
 

The Following 4 Users Say Thank You to Venemo For This Useful Post:
Posts: 285 | Thanked: 1,900 times | Joined on Feb 2010
#599
Well... Retina on a 4.7" device is OK for me; Full HD on a 13" laptop is kind of OK, though I wouldn't mind an even higher resolution on it. High-resolution screens do have their benefits, especially for the clarity of small objects and text.

On cameras, it's not the megapixels per se that create the need for more CPU power, but all those other features that actually make the images taken with camera phones bearable. They use quite a bit of software touch-up, not to mention the DSP power spent improving the sound quality of videos. Of course you could try to build a phone into a DSLR, but who would carry such a device around? Honestly, on our last holiday trip to Europe I didn't bother to take my DSLR, with all the other stuff already packed into a small car with two kids and a wife; also, the quality of the videos taken at several concerts is quite good (and I haven't even tried 4K yet).

One thing that surprises me is the urge to stick to everything really old, as if all progress and development were inherently bad and should be avoided at all costs, just because in theory it's possible to achieve feature X using a minimal amount of CPU power, although making it work takes a considerable amount of human resources and time (not to mention cash). AFAIK the last mobile OS created with such a mindset was Symbian, and that ended up being a major PITA for both developers and users.
 

The Following 3 Users Say Thank You to JulmaHerra For This Useful Post:
Guest | Posts: n/a | Thanked: 0 times | Joined on
#600
Originally Posted by juiceme View Post
And now, seriously, hands up anyone who really needs those ever-higher-resolution screens?
Since you asked, I'll raise my hand. A lot of the systems I design these days are being moved from desktops and laptops to tablets, and now even phones. That last one surprises even me, since I design enterprise-level applications.

And please explain why, since the pixels are already too small to see without a microscope??
Good UX doesn't mean scaling the controls down to the point where you'd need a microscope. I think what happens is that whenever a desktop application is ported to a phone, it's not adapted for the screen in use.

That's always been my biggest issue with running Debian on the N900. Glad it works; sucks that it's not optimized screen-wise.

I hold that even half-HD resolution is almost too much for a 5" device...
We differ here. Again, folks keep bringing desktop Linux to a 5" screen without adapting a damn thing.

Sidenote: Why do we still use inches for screens but metric for everything else?

And cameras! People have gone straight-off-the-edge megapixel-crazy. And it's the megapixels that consume power/memory/cycles.
Absolutely agree here. I'm not a fan of the current megapixel race. It strikes me as the same nonsense as the gigahertz race between Intel and AMD back in the day.

AMD actually won this one.

Instead I'd opt for fewer pixels and better optics...
I'd argue for better-adapted UIs that work within the space given to them. That part is often overlooked in these open source projects. Wonderful functionality, **** design, and an even ****tier concept of what users actually want to do.

Not all users are engineers. Or geeks.
 

The Following 9 Users Say Thank You to For This Useful Post:

Tags
n950 revival, q-device, qwerty keyboard, sailfishos, sailingchen


 