Android History and Features, Android Update

What do you mean by Android? 


Android is a mobile operating system developed by Google. It is used on many smartphones and tablets.


Examples include the Sony Xperia, the Samsung Galaxy, and the Google Nexus One. Unlike Apple's iOS, Android is open source, meaning developers can modify and customize the OS for each phone.

What is the first Android phone?


The first Android phone was launched by HTC on 22 October 2008. The HTC Dream, also known as the T-Mobile G1 in the United States of America and parts of Europe, was the first commercially launched device powered by the Android operating system.

What do you mean by Android OS?


The Android OS is an open source operating system primarily used in mobile devices. Written primarily in Java and based on the Linux kernel, it was initially developed by Android Inc., which Google purchased in 2005.

What is the difference between a smartphone and an android?


Mobile Operating System: A mobile operating system, also referred to as a mobile OS, is an operating system that runs a smartphone, tablet, PDA, or other mobile device. So a smartphone that runs the Android OS is an Android phone; you may call it a smartphone, an Android phone, or an Android smartphone.

Are all Android phones smartphones?


The answer is yes. All Android phones run on the powerful Android operating system and are considered smartphones. So when choosing a smartphone, the operating system is one of the first puzzles to solve and will get the ball rolling towards your final choice.

 

  History


Android Inc. was founded in Palo Alto, California in October 2003 by Andy Rubin, Rich Miner, Nick Sears, and Chris White. Rubin described the Android project as having "tremendous potential in developing smarter mobile devices that are more aware of its owner's location and preferences". The company's early intention was to develop an advanced operating system for digital cameras, and this was the basis of its pitch to investors in April 2004. The company then decided that the market for cameras was not large enough for its goals, and five months later it had diverted its efforts and was pitching Android as a handset operating system that would rival Symbian and Microsoft Windows Mobile.

Rubin had difficulty attracting investors early on, and Android was facing eviction from its office space. Steve Perlman, a close friend of Rubin, brought him $10,000 in cash in an envelope, and shortly thereafter wired an undisclosed amount as seed funding. Perlman refused a stake in the company, and has stated, "I did it because I believed in the thing, and I wanted to help Andy."

The "Sooner" prototype phone, running a pre-release version of Android


What are the versions of Android?









This is your quick primer on the different versions of Android that are still alive and kicking, from newest to oldest.

 Android 8.1 Oreo (2017)


Android Oreo is the eighth major release of the Android operating system. The 8.1 update was first released as a developer preview on October 25, 2017, with factory images for current Nexus and Pixel devices. A second developer preview was made available on November 27, 2017 for Nexus and Pixel devices, before the stable version was released on December 5, 2017.




Android 7.0 Nougat (2016)




In March 2016 (March!!!), Google surprised pretty much everyone by releasing the N Developer Preview a full month and a half ahead of the yearly Google I/O developer conference. This gave app developers (and hard-core nerds) the opportunity to taste the next major flavor of Android before it was actually released. On June 30, Google gave us the nickname: Nougat.

 

Android 6.0 Marshmallow (late 2015)




Android 6.0 "Marshmallow" was unveiled under the codename "Android M" during Google I/O on May 28, 2015, for the Nexus 5 and Nexus 6 phones, Nexus 9 tablet, and Nexus Player set-top box, under the build number MPZ44Q. The third developer preview (MPA44G) was released on August 17, 2015 for the Nexus 5, Nexus 6, Nexus 9 and Nexus Player devices, and was updated to MPA44I that brought fixes related to Android for Work profiles.

Android 5.0 Lollipop (late 2014)




Android 5.0 "Lollipop" was unveiled under the codename "Android L" on June 25, 2014, during Google I/O. It became available as official over-the-air (OTA) updates on November 12, 2014, for select devices that run distributions of Android serviced by Google, including Nexus and Google Play edition devices. Its source code was made available on November 3, 2014.

Lollipop features a redesigned user interface built around a responsive design language referred to as "material design". Other changes include improvements to the notifications, which can be accessed from the lockscreen and displayed within applications as top-of-the-screen banners. Furthermore, Google made internal changes to the platform, with the Android Runtime (ART) officially replacing Dalvik for improved application performance, and with changes intended to improve and optimize battery usage, known internally as Project Volta.

Android 4.4 KitKat (late 2013)


Google announced Android 4.4 KitKat on September 3, 2013. Although initially under the "Key Lime Pie" ("KLP") codename, the name was changed because "very few people actually know the taste of a key lime pie." Some technology bloggers also expected the "Key Lime Pie" release to be Android 5. KitKat debuted on Google's Nexus 5 on October 31, 2013, and was optimized to run on a greater range of devices than earlier Android versions, with 512 MB of RAM as a recommended minimum; those improvements were known internally at Google as "Project Svelte". The required minimum amount of RAM available to Android is 340 MB, and all devices with less than 512 MB of RAM must report themselves as "low RAM" devices.
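
To make that concrete, here is a minimal Kotlin sketch (my own illustration, not code from the original article) of how an app can query the "low RAM" flag KitKat introduced, assuming API level 19 or later:

    import android.app.ActivityManager
    import android.content.Context

    // Hypothetical helper: returns true on devices that have declared themselves
    // "low RAM" (roughly the sub-512 MB class targeted by Project Svelte), so the
    // app can scale back caches, animations, or background work accordingly.
    fun isLowRamDevice(context: Context): Boolean {
        val activityManager =
            context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
        return activityManager.isLowRamDevice
    }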


Android 4.1-4.3 Jelly Bean (mid-2012)


Google was expected to announce Jelly Bean 4.2 at an event in New York City on October 29, 2012, but the event was cancelled due to Hurricane Sandy. Instead of rescheduling the live event, Google announced the new version with a press release, under the slogan "A new flavor of Jelly Bean". Jelly Bean 4.2 was based on Linux kernel 3.4.0, and debuted on Google's Nexus 4 and Nexus 10, which were released on November 13, 2012.


Android 4.0 Ice Cream Sandwich (late 2011)


The follow-up to Honeycomb was announced at Google IO in May 2011 and released in December 2011. Dubbed Ice Cream Sandwich and finally designated Android 4.0, ICS brought many of the design elements of Honeycomb to smartphones, while refining the Honeycomb experience.

The first device to launch with ICS was the Samsung Galaxy Nexus. The Motorola Xoom and the ASUS Transformer Prime were the first tablets to receive updates, while the Samsung Nexus S was the first smartphone to make the jump to Android 4.0.

Android 3.x Honeycomb (early 2011)


Android 3.0 Honeycomb came out in February 2011 with the Motorola Xoom. It's the first (and only) version of Android specifically made for tablets, and it brought a lot of new UI elements to the table. Things like a new System bar at the bottom of the screen to replace the Status bar we see on phones, and a new recent-applications button, were great additions for the screen real estate offered by Android tablets.

Some of the standard Google applications were also updated for use with Honeycomb, including the Gmail app and the Talk app. Both made great use of fragments, and the Talk app added built-in video chat and calling support. Under the hood, 3D rendering and hardware acceleration were greatly improved.
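
As a rough illustration of the fragment pattern those apps leaned on, here is my own sketch (not code from the article), written against the modern AndroidX API rather than Honeycomb's original android.app.Fragment; the layout and container IDs are placeholders:

    import android.os.Bundle
    import androidx.appcompat.app.AppCompatActivity
    import androidx.fragment.app.Fragment

    // Two reusable panes, each backed by its own (hypothetical) layout resource.
    class ConversationListFragment : Fragment(R.layout.fragment_conversation_list)
    class ConversationDetailFragment : Fragment(R.layout.fragment_conversation_detail)

    // A tablet-style activity composing the two panes side by side, the kind of
    // layout Honeycomb's fragments were designed to enable.
    class MailActivity : AppCompatActivity(R.layout.activity_mail) {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            if (savedInstanceState == null) {
                supportFragmentManager.beginTransaction()
                    .add(R.id.list_pane, ConversationListFragment())
                    .add(R.id.detail_pane, ConversationDetailFragment())
                    .commit()
            }
        }
    }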

We can't talk about Honeycomb without mentioning that it also marked the debut of Google's new distribution method, where manufacturers are given the source code and a license to use it only after their hardware choices have been approved by Google. This dampened third-party development, as the source code was no longer available for all to download and build. And, in fact, Google never released the Honeycomb source.

Improvements to Honeycomb were announced at Google IO in May 2011 as Android 3.1, and Android 3.2 followed shortly thereafter. But Honeycomb is basically regarded as a forgotten version.

Android 2.3 Gingerbread (late 2010)


Android 2.3 Gingerbread came out of the oven in December 2010, and like Eclair it had a new "Googlephone" to go along with it: the Nexus S. Gingerbread brought a few UI enhancements to Android, things like a more consistent feel across menus and dialogs and a new black notification bar, but it still looked and felt like the Android we were used to, with the addition of support for a slew of new languages.

Gingerbread brought support for new technology as well. NFC (Near Field Communication) became supported, and SIP (Internet calling) support became native on Android. Further optimizations for better battery life rounded out a nice upgrade.
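
Here is a minimal Kotlin sketch (again my own illustration, not from the article) of how an app can check at runtime whether the NFC hardware Gingerbread introduced is present and enabled:

    import android.content.Context
    import android.content.pm.PackageManager
    import android.nfc.NfcAdapter

    // Hypothetical helper describing the device's NFC state.
    fun describeNfc(context: Context): String {
        if (!context.packageManager.hasSystemFeature(PackageManager.FEATURE_NFC)) {
            return "No NFC hardware"
        }
        val adapter = NfcAdapter.getDefaultAdapter(context)
        return if (adapter != null && adapter.isEnabled) {
            "NFC present and enabled"
        } else {
            "NFC present but currently disabled"
        }
    }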

Behind the scenes, the fellows at Mountain View spent time on more JIT (Just-In-Time compiler) optimizations and made great improvements to Android's garbage collection, which should stop any stuttering and improve UI smoothness. Round that out with a new multimedia framework for better support of sound and video files.

Android 2.2 Froyo (mid-2010)


Android 2.2 Froyo was announced in May 2010 at the Google IO conference in San Francisco. The single largest change was the introduction of the Just-In-Time compiler, or JIT, which significantly sped up the phone's processing.

Along with the JIT, Android 2.2 also brings support for Adobe Flash 10.1. That means you can play your favorite Flash-based games in Android's web browser. Take that, iPhone!

Froyo also brought native support for tethering, meaning you could use your Android smartphone's data connection to provide Internet (wirelessly or with a USB cable) to just about any device you want. Sadly, most carriers will strip this native support in exchange for some sort of feature they can charge for. (Can't really blame them, can you?)

Android 2.0-2.1 Eclair (late 2009)


Eclair was a pretty major step up over its predecessors. Introduced in late 2009, Android 2.0 first appeared on the Motorola Droid, bringing improvements to the browser and Google Maps, and a new user interface. Google Maps Navigation was also born in Android 2.0, quickly bringing the platform on par with other stand-alone GPS navigation systems.

Android 2.0 quickly gave way to 2.0.1, which the Droid received in December 2009, mainly bringing bugfixes. And to date, the Droid remains the only phone to have explicitly received Android 2.0.1.

The now-defunct Google Nexus One was the first device to receive Android 2.1 when it launched in January 2010, bringing a souped-up UI with cool 3D-style graphics. From there, the rollout of Android 2.1 was relatively slow and painful. Manufacturers skipped Android 2.0 in favor of the latest version but needed time to tweak their customizations, such as Motorola's Motoblur.

HTC's Desire and Legend phones launched with Android 2.1 later in the year, touting a new and improved Sense user interface.

Android 1.6 Donut (late 2009)


Donut, released in September 2009, expanded on the features that came with Android 1.5. While not very rich in the eye-candy department, Android 1.6 made some major improvements behind the scenes, and provided the framework base for the amazing features to come.  To the end user, the two biggest changes would have to be the improvements to the Android Market, and universal search.

Behind the screen, Donut brought support for higher-resolution touchscreens, much improved camera and gallery support, and perhaps most importantly, native support for Verizon and Sprint phones. Without the technology in Android 1.6, there would be no Motorola Droid X or HTC EVO 4G — two major phones for those carriers.

The devices released with Android 1.6 cover a wide range of taste and features, including the Motorola Devour, the Garminphone, and the Sony Ericsson Xperia X10.

Android 1.5 Cupcake (mid-2009)


Cupcake was the first major overhaul of the Android OS.  The Android 1.5 SDK was released in April 2009 and brought along plenty of UI changes, the biggest probably being support for widgets and folders on the home screens.

There were plenty of changes behind the scenes, too.  Cupcake brought features like improved Bluetooth support, camcorder functions, and new upload services like YouTube and Picasa.

Android 1.5 ushered in the era of the modern Android phone, and the explosion of devices included favorites like the HTC Hero and Eris, the Samsung Moment, and the Motorola Cliq.

Questions you may ask


Is Android owned by Google?


Initially developed by Android Inc., which Google bought in 2005, Android was unveiled in 2007, along with the founding of the Open Handset Alliance – a consortium of hardware, software, and telecommunication companies devoted to advancing open standards for mobile devices.

Which is better, an iPhone or an Android?


Both Android and the iPhone show you the time when you press the lock button. But many Android phones do time keeping better. Motorola, Samsung and HTC have smart covers and sensors that automatically show you the time when you pull the phone out of your pocket.


What is the difference between a smartphone and an iPhone?


Key Difference: The iPhone is the flagship phone developed and manufactured solely by Apple. The device operates on Apple's iOS operating system and is currently in its 5th generation. Smartphones are any mobile phones that are similar to a mini computer. Smartphones and iPhones are two terms that actually go hand-in-hand.

What is the difference between a cell phone and a mobile phone?


Cellphone is short for cellular phone, the name given to portable phones that use cellular technology. Since they are generally portable, they are also called mobile phones. So the smartphone is a mobile phone with a lot of functionality that is also a cellular phone, because it uses cellular network technology.

What is the best version of Android?


They were certainly great releases, just as Android Marshmallow will be, but for the sense of Android coming into its own, I can't go past Jelly Bean. Obviously the latest versions of Android are generally the best ones; nowadays KitKat, Lollipop and Marshmallow are the best operating systems for Android.

Is Google using Linux?


Google's desktop operating system of choice is Ubuntu Linux. Most Linux people know that Google uses Linux on its desktops as well as its servers. But almost no one outside of Google knew exactly what was in it or what role Ubuntu Linux plays on Google's campus, until now.


What is Android 7.0 name?


Following Android Alpha and Android Beta, Google has always named its Android OS updates after sweet treats, and in alphabetical order. So far we've had Cupcake, Donut, Eclair, Froyo, Gingerbread, Honeycomb, Ice Cream Sandwich, Jelly Bean, KitKat, Lollipop and Marshmallow. In 2016 we have Nougat.

Is Android better than Windows?




Windows Phone is not an open source platform, and Microsoft has a stricter criteria set than Google for which apps and games can populate their respective marketplaces. As a result, its app store responds with superior apps and cleaner options than what Android apps can offer.


Is Android different from Windows?


Key Difference: Android is an open source, free, Linux-based operating system for smartphones and tablets. The system was designed and developed by Android Inc., which was funded and later purchased by Google in 2005. Windows Phone is a series of proprietary software developed and marketed by Microsoft Corporation.

Which is the best operating system for mobile?


Comparison Of Top Mobile OS



  • Symbian.

  • Android. September 20th, 2008 was the date when Google released the first Android OS, under the name 'Astro'. Some time later, the upgraded versions 'Bender' and 'Cupcake' were also released.

  • Apple iOS.

  • Blackberry OS.

  • Windows OS.

  • BADA.

  • Palm OS (Garnet OS)


Why Android is better than Windows?


All we need now is a few more decent apps and some compelling Windows 10 Mobile hardware (a Surface Phone, maybe?), and iOS and Android might have to think about glancing back for a second. Built-in apps also make a big difference to how the operating system functions.

Can I install Windows on a android phone?


Take a look at iOS or Windows Phone devices: they can run Android because it is an open source OS, so it isn't so hard to port it. For proof of a Windows and Android dual boot on a tablet, just Google the Chuwi HI8 tablet.

Is Android compatible with Windows 10?


Microsoft already has OneDrive, OneNote, Skype, Outlook, and Microsoft Office apps for Android and iPhone. All are free. The soon-to-be-released apps will let you see photos taken on your phone in Microsoft's Windows 10 PC Photos app.

Created by Mustafa Baig


 







Hands On With Google’s Pixel 2 XL: More Pixels, More Google



 

This morning at a press event in San Francisco, Google held the second of what has now become an annual hardware event for the company, their Made by Google event. As with last year’s show, this year’s presentation showcased a mix of different devices and accessories from Google. However the most anticipated device for enthusiasts across the spectrum was without a doubt the next Google Pixel phones, which as expected made their introduction today as the Google Pixel 2 and Pixel 2 XL.

Last year’s introduction of the Pixel phone family was a significant departure from tradition for Google. The company retired the Nexus lineup of aggressively priced mid-range/high-end phones in favor of what is best called Google’s take on what a flagship Android phone should be. The Pixels had cutting-edge specifications and features; they also had a flagship price. Depending on who you ask and what statistics you use, it can be argued just how much of the Android handset market that the Pixel phones actually captured – a situation not helped by the Pixel phones being so hard to get for a while – however what can’t be argued is that it had a definite impact on the expectations of the Android phone market.


With the original Pixel phones, Google and its manufacturing partner HTC created a solid phone that was perhaps a bit derivative in design, but nonetheless had a unique aesthetic that helped separate it from other flagship phones. The Pixel 2 phones, in turn, do not significantly rock the boat here. Instead they come off as a natural evolution of the original Pixel phones.


 



Under the hood, you’ll find all the bells and whistles you’d come to expect from a flagship smartphone in 2017. Google is using Qualcomm’s Snapdragon 835 SoC, which is paired with 4GB of LPDDR4x RAM. As with the first-generation Pixel phones, outside of screen resolution and battery size, both phones share the same internals; so we’re looking at the same storage, RAM, SoC, camera modules, etc. In other words, the Pixel 2 XL really is a larger version of the Pixel 2, rather than a technically superior version.

 

 

 



That said, even within the limited confines of display and battery size changes, the phones do stand apart. The Pixel 2 retains its predecessor’s 5-inch 1080p AMOLED display. I didn’t get hands-on time with this phone, but I expect that it will have the same Pentile arrangement as its predecessor as well. Meanwhile the Pixel 2 XL gets a new, larger screen; instead of a 5.5-inch 1440p AMOLED, it’s now rocking a 6-inch 2880x1440 pOLED (Plastic OLED) display. Google is not discussing who is providing the pOLED display, but given LG’s recent actions in this space, I suspect they are the supplier. Regardless, the pOLED display does end up being a bit better than the Pixel 2’s display, offering 100% DCI-P3 coverage rather than 95%.

Meanwhile new to the Pixel phone family, both phones now have always-on screens courtesy of their OLED displays. Google calls both of the OLED displays vivid, and that’s certainly the case for the Pixel 2 XL I got to spend some time with. Android’s color space management limitations are well-known, and I am eager to see if Google has done something to improve the situation on their own phones.

 



On the literal flip side of the phones is the Pixel 2 family's 12.2MP rear camera. In terms of resolution this is very similar to last year's phones, but this is clearly a new sensor. The pixel size is smaller, at 1.4um versus 1.55um, and the aperture is now f/1.8. Perhaps the single biggest technical change here is that after not including Optical Image Stabilization (OIS) in last year's phones, the Pixel 2s now get OIS, and OIS can be used alongside EIS, which Google calls Fused Video Stabilization.

In terms of overall quality, one of the big focal points of the original Pixel phones was to have the highest quality smartphone camera on the market. And while the Pixel has since been surpassed, Google is continuing to pursue that direction with the Pixel 2. While DxOMark is not the sole arbiter of camera quality, the record-setting score of 98 means that the Pixel 2 phones should be very competitive in the market, and that the Pixel 2 will be worth keeping an eye on.



Speaking of cameras, Google has also added some computational photography features to the phones via a portrait mode. This is a particularly interesting development since the phones still only contain a single rear-facing camera. So Google is doing the necessary depth mapping without the benefit of a second camera and the parallax effect to isolate the foreground from the background. Google has been a major player in the computer vision space, and I’m very interested in seeing how well their tech works in practice, as Google is definitely taking the hard way towards portrait mode by going this route. On the plus side, because they don’t need two cameras, the front-facing camera can be used for portrait mode as well.

Moving on, in terms of build quality, Google has taken a step up with the Pixel 2 phones. Both phones are now IP67 water and dust resistant, the latter in particular being a major improvement over the original Pixel’s much more limited IP53 resistance. Google’s one of the last flagship vendors to add this level of water resistance, but none the less it’s a welcome development.

Overall the phones are just a bit larger than their original counterparts. The Pixel 2 is a couple of mm taller and wider, and the same goes for the Pixel 2 XL as well, despite the taller 18:9 aspect ratio display. Also, after avoiding a camera hump on the original Pixels, there is now one present on the Pixel 2s (almost certainly a result of including OIS).



Otherwise the aluminum body feels very similar to the original phones. In fact at just about every level, the Pixel 2 phones feel like the original Pixel. The 2 is not a radical design departure – nor does it need to be – so the whole thing feels very similar in-hand. If you were comfortable with a Pixel, you’ll likely be comfortable with a Pixel 2.

Which isn’t to say that Google’s phone hasn’t learned some new tricks. Taking a page from HTC’s playbook (who I’m assuming is building the phone again), Google has added squeezing as an input action to the phone. Google calls this Active Edge, and it can be configured to invoke various actions. The default action, fittingly enough, activates Google Assistant, giving Google a dedicated non-voice action for Assistant without adding a button. Active Edge will even work with cases, including Google’s new Pixel 2 cases.

Speaking of Google Assistant: unsurprisingly, software is a big part of Google's pitch with the Pixel 2 family. In fact the company was rather candid in their keynote that with the gradual slowing of Moore's Law and general hardware development, they can't release a radically different phone every single year. As a result the company has embarked on what they're calling a Hardware + Software + AI focus for their products. This embodies the above hardware, numerous features offered by Google Assistant – including a suite of features coming over from or integrating with Google Home – and then the rest of the pure Android 8.0 Oreo software stack. Pixel users will also be getting a preview release of Google's Lens functionality later this year. Broadly speaking, Google is following a similar trend as other handset manufacturers, moving from competing just on specifications to making a complete ecosystem/lifestyle play.



Otherwise, despite an overall strong hardware configuration, hardware enthusiasts will likely come away disappointed with one thing: the loss of the 3.5mm audio jack. Google has removed the audio jack for the Pixel 2, similar to so many other phones in the last year. Instead users will need to use an adapter, USB-C headphones, or Bluetooth headphones. No doubt removing the jack helps with that IP67 waterproofing, but officially Google hasn’t said why the jack has been removed.

Rounding out the experience, like the original Pixel, Verizon is once again Google’s exclusive carrier launch partner in the US, meaning they are the only carrier selling the phone. However as this is a global launch, Google is actually selling both Verizon and unlocked versions of the phones. So buyers will have the option of going to other carriers, with a bit more effort.

Finally, the Pixel 2 phones will begin shipping on October 17th, with pre-orders starting immediately. The Pixel 2 is being priced at $649 for a 64GB model, while the Pixel 2 XL will go for $849 for the same capacity. For another $100, both phones can be upgraded to 128GB of storage. 

Written by Mustafa Baig

HiSilicon Kirin 970 - Android SoC Power & Performance Overview



Today I would say that there are only two truly vertically integrated mobile OEMs who have full control over their silicon: Apple and Huawei – and of the two, one could say Huawei is currently even more integrated due to in-house modem development. Huawei's semiconductor division, HiSilicon, has over the last several years been the one company which seems to have managed what the others haven't: break into the high-end market with solutions that are competitive with the current leader in the business, Qualcomm.

I remember the Honor 6 with the newly branded Kirin 920 SoC (previously the line-up had no "halo" name) as the first device with the company's in-house SoC that we reviewed. It and the following generation, the Kirin 930, suffered from immaturity, with problems such as a very power hungry memory controller and a very disappointing camera processing pipeline (ISP/DSP). The Kirin 950 was in my opinion a turning point for HiSilicon, as the product truly impressed and raised the quality of the end device, catching many eyes in the semiconductor industry, including my own in the resulting review of the Huawei Mate 8.

Over the last several years we’ve seen great amounts of consolidation in the mobile semiconductor industry. Companies such as Texas Instruments which were once key players no longer offer mobile SoC products in their catalogue. We’ve seen companies such as Nvidia try and repeatedly fail at carving out meaningful market-share. MediaTek has tried providing higher end SoCs with the Helio X line-up with rather little success to the point that the company has put on hold development in that segment to rather focus on higher margin parts in the P-series.

Meanwhile even Samsung LSI, while having a relatively good product with its flagship Exynos series, still has not managed to win over the trust of the conglomerate's own mobile division. Rather than using Exynos as an exclusive keystone component of the Galaxy series, Samsung has instead been dual-sourcing it along with Qualcomm's Snapdragon SoCs. It's therefore not hard to make the claim that producing competitive high-end SoCs and semiconductor components is a really hard business.

Last year's Kirin 960 was a bit of a mixed bag: the SoC still delivered good improvements over the Kirin 950; however, it was limited in what it could achieve against competing flagship SoCs from Samsung and Qualcomm, as they both had a process node advantage. Huawei's introduction of flagships with a new generation of SoCs in the fourth quarter is closer to Apple's release time-frame than the usual first quarter we've become accustomed to from Qualcomm and Samsung.

As such, when pitting the Kirin against Snapdragons and Exynoses, we're looking at a product that's more often than not late to the party in terms of introducing new technologies such as process nodes and IP. The Kirin 970 fits this profile: as a 10nm Cortex-A73 generation SoC it lagged behind Qualcomm and Samsung in terms of process node, yet it was released too early relative to ARM's schedule to adopt DynamIQ and A75- and A55-based CPU cores for this cycle. That being said, the Kirin 970 enjoys a few months of technical feature parity with the Snapdragon 835 and Exynos 8895 before we see the new Snapdragon 845 and Exynos 9810 later in the usual spring refresh cycle.

Nevertheless, the article today is a focus on the Kirin 970 and its improvements and also an opportunity to review the current state of SoCs powering Android devices.



The Kirin 970 isn't a major IP overhaul, as it continues to use the same CPU IP from ARM that was used in the Kirin 960. The new SoC doesn't even improve the frequency of the CPU clusters, as we still see the same 2.36GHz for the A73 cores and 1.84GHz for the A53 cores. When ARM originally launched the A73 we saw optimistic targets of up to 2.8GHz on TSMC 10nm, but the industry seems to have largely missed that target, a sign of the ever increasing difficulty of scaling frequency in mobile SoCs as the diminishing returns from process node updates become worse and worse.

The Kirin 970 does bring a major overhaul and change in the GPU configuration as we see the first implementation of ARM’s Mali G72 in a 12-cluster configuration, a 50% increase in core count over the Kirin 960’s G71-MP8 setup. The new GPU is running at a much reduced frequency of 746MHz versus the 1033MHz of the Kirin 960. In Matt Humrick’s review of the Kirin 960 we saw some disastrous peak average power figures of the Mali G71 outright exploding the thermal envelope of the Mate 9, so hopefully the architectural improvements of the new G72 alongside a wider and lower clocked configuration in conjunction with the new process node will bring significant improvements over its predecessor.

The new modem in the Kirin 970 now implements 3GPP LTE Release 13 and supports downlink speeds of up to 1200Mbps thanks to up to 5x20MHz carrier aggregation with 256-QAM, making the new Kirin modem feature equivalent to Qualcomm’s X20 modem that’ll be integrated in the Snapdragon 845.

The big story surrounding the Kirin 970 was the inclusion of a dedicated neural processing unit. The NPU, as HiSilicon decided to name it, is part of a new type and generation of specialised acceleration blocks aimed at offloading "inferencing" of convolutional neural networks (CNNs). Many will have heard buzzwords such as artificial intelligence surrounding the topic, but the more accurate terms are machine learning or deep learning. The hardware acceleration blocks, which carry various names from various companies, do not actually do any deep learning themselves; rather, they are there to improve execution (inferencing) of neural network models, while the training of those models remains something done either in the cloud or by other blocks in the SoC, such as the GPU. It's still early days, but we'll have a proper look at the NPU in its dedicated section of the article.
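
As an illustrative aside (my own sketch, not something the article describes): on Android 8.1 and later, one common way an app reaches such an accelerator is indirect, by letting an inference framework such as TensorFlow Lite delegate execution to the Android Neural Networks API (NNAPI), which the vendor's driver can then map onto hardware like an NPU. The model file name and tensor shapes below are placeholders:

    import org.tensorflow.lite.Interpreter
    import org.tensorflow.lite.nnapi.NnApiDelegate
    import java.io.File

    // Hypothetical sketch: run a pre-trained CNN through TensorFlow Lite and ask it
    // to delegate inferencing to NNAPI. Training stays off-device; only the forward
    // pass is (potentially) handed to dedicated hardware.
    fun classify(modelFile: File, input: Array<Array<Array<FloatArray>>>): FloatArray {
        val nnApiDelegate = NnApiDelegate()
        val options = Interpreter.Options().addDelegate(nnApiDelegate)
        val interpreter = Interpreter(modelFile, options)

        val output = Array(1) { FloatArray(1000) }  // e.g. 1,000 class scores
        interpreter.run(input, output)

        interpreter.close()
        nnApiDelegate.close()
        return output[0]
    }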



As mentioned, one of the bigger improvements of the Kirin 970 is the switch to TSMC's 10FF manufacturing node. While 10nm is supposed to be a long-lived node for Samsung's foundry – where indeed we'll see two full generations of SoCs produced on 10LPE and 10LPP – TSMC is taking a different approach and sees its own 10FF process as a short-lived node and stepping-stone to the much anticipated 7FF node, which is to be introduced later in 2018. As such, the only TSMC 10FF mobile products to date have been the low-volume MediaTek X30 and Apple A10X in the summer and the high-volume Apple A11 and HiSilicon Kirin 970 in Q3-Q4, two to three quarters after Samsung had entered high-volume production of the Snapdragon 835 and Exynos 8895.

HiSilicon's expectations of the new process node are rather conservative: an improvement of only 20% in efficiency at the same performance point for the apples-to-apples CPU clusters, below ARM's earlier predictions of 30%. This rather meagre improvement in power is likely one of the reasons why HiSilicon decided not to increase the CPU clocks on the Kirin 970, instead focusing on bringing down power usage and lowering the TDP compared to the Kirin 960.

The SoC does enjoy a healthy die size shrink from 117.72mm² down to 96.72mm² even though the new SoC has 50% more GPU cores as well as new IP blocks such as the NPU. Our colleagues at TechInsights have published a detailed per-block size comparison between the Kirin 960 and Kirin 970 and we see a 30-38% decrease in block size for apples-to-apples IP. The Cortex-A73 quad-core cluster now comes in at only 5.66mm², a metric to keep in mind and in stark contrast to Apple which is investing twice as much silicon area in its dual-core big CPU cluster.


The Samsung Exynos M3 - 6-wide Decode With 70%+ IPC Increase



The Exynos 9810 was one of the first big announcements of 2018, and it was quite an exciting one. Samsung's claim of doubling single-threaded performance was definitely an eye-catching moment and got a lot of attention. The new SoC sports four of Samsung's third-generation Exynos M3 custom architecture cores running at up to 2.9GHz, alongside four Cortex A55 cores at 1.9GHz.

Usually Samsung LSI’s advertised target frequency for the CPUs doesn’t necessarily mean that the mobile division will release devices with the CPU running at those frequencies. The Exynos 8890 was advertised by SLSI to run up to 2.7GHz, while the S7 limited it to 2.6GHz. The Exynos M2’s DVFS tables showed that the CPU could go up to 2.8GHz but was rather released with a lower and more power efficient 2.3GHz clock. Similarly, it’s very possible we might see more limited clocks on an eventual Galaxy S9 with the Exynos 9810.

Of course, even accounting for the fact that part of Samsung's performance increase claim for the Exynos 9810 comes from the clockspeed jump from 2.3GHz to 2.9GHz, that still leaves a massive performance discrepancy towards the goal of doubling single-threaded performance. Thus, this performance delta must come from the microarchitectural changes. Indeed, the effective IPC increase must be in the 55-60% range for the math to make sense.
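
As a rough back-of-the-envelope check of that figure (my own arithmetic, not Samsung's):

\[
\frac{2.9\ \mathrm{GHz}}{2.3\ \mathrm{GHz}} \approx 1.26, \qquad \frac{2.0}{1.26} \approx 1.59,
\]

so if roughly a 1.26x gain comes from clock speed alone, the remaining ~1.55-1.6x toward a 2x single-threaded target has to come from IPC, which is exactly the 55-60% range.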

With the public announcement of the Exynos 9810 having finally taken place, Samsung engineers are now free to release information on the new M3 CPU microarchitecture. One source of information that has been invaluable over the years for digging into the deeper workings of CPU µarchs is a company's own submissions to open-source projects such as the GCC and LLVM compilers. Luckily Samsung is a fantastic open-source contributor and yesterday posted the first patches describing the machine model for the M3 microarchitecture.

To better visualise the difference between the previous microarchitectures and the new M3, we take a step back in time to have a look at the high-level pipeline configuration of the Exynos M1/M2:

 

 

At heart, the Exynos M1 and M2 microarchitectures are based on a 4-wide in-order stage for decode and dispatch. The wide decode stage was rather unusual at the time, as ARM's own Cortex A72 and A73 architectures made do with 3- and 2-wide instruction decoders respectively. With the Exynos M1/M2 being Samsung LSI's first in-house microarchitecture, it's possible that the front-end wasn't as advanced as ARM's, as the latter's 2-wide A73 microarchitecture was more than able to keep up in terms of IPC against the 4-wide M1 & M2. Samsung's back-end for the M1 and M2 included 9 execution ports:

  • Two simple ALU pipelines capable of integer additions.

  • A complex ALU handling simple operations as well as  integer multiplication and division.

  • A load unit port

  • A store unit port

  • Two branch prediction ports

  • Two floating point and vector operations ports leading to two mixed capability pipelines


The M1/M2 were effectively 9-wide dispatch and execution machines. In comparison, the A73 dispatches up to 8 micro-ops into 7 pipelines and the A75 dispatches up to 11 µops into 8 pipelines, keeping in mind that we're talking about very different microarchitectures here and the execution capabilities between the pipelines differ greatly. From fetch to write-back, the M1/M2 had a pipeline depth of 13 stages, which is 2 stages longer than that of the A73 and A75, resulting in worse branch-misprediction penalties.

This is only a rough overview of the M1/M2 cores; Samsung published a far more in-depth microarchitectural overview at Hot Chips 2016, which we've covered here.

 

The Exynos M3 differs greatly from the M1/M2, as it completely overhauls the front-end and also widens the back-end. The M3's front-end fetch, decode, and rename stages now increase in width by 50% to accommodate a 6-wide decoder, making the new microarchitecture one of the widest in the mobile space, alongside Apple's CPU cores.

This comes at a cost, however, as some undisclosed stages in the front-end become longer by 2 cycles, increasing the minimum pipeline depth from fetch to writeback from 13 to 15 stages. To counteract this, Samsung must have improved the branch predictor; however, we can't confirm exactly what individual front-end stage improvements have been made. The reorder buffer on the rename stage has seen a massive increase from 96 entries to 228 entries, indicating that Samsung is trying to vastly increase its ability to extract instruction-level parallelism to feed the back-end execution units.

The depiction of the schedulers is my own best guess at how the M3 looks, as it seemed to me like the natural progression from the M1 configuration. What we do know is that the core dispatches up to 12 µops into the schedulers and that we have 12 execution ports:

  • Two simple ALU pipelines for integer additions, same as on the M1/M2.

  • Two complex ALUs handling simple integer additions and also multiplication and division. The doubling of the complex pipelines means that the M3 has now double the integer multiplication throughput compared to the M1/M2 and a 25% increase in simple integer arithmetic.

  • Two load units. Again, the M3 here doubles the load capabilities compared to the M1 and M2.

  • A store unit port, same as on the M1/M2.

  • Two branch prediction ports, likely the same setup as on the M1/M2, capable of feeding the two branches/cycle the branch prediction unit is able to complete.

  • Instead of 2 floating point and vector pipelines, the M3 now includes 3 of them, all of them capable of complex operations, theoretically vastly increasing FP throughput.


The simple ALU pipelines already operate at single-cycle latencies, so naturally there's not much room for improvement there. On the side of the complex pipelines we still see 4-cycle multiplications for 64-bit integers; however, integer division has been greatly improved, from 21 cycles down to 12 cycles. I'm not sure if the division unit reserves both complex pipelines or only one of them, but what is clear, as mentioned before, is that integer multiplication throughput is doubled, and the additional complex pipe also increases simple arithmetic throughput from 3 to 4 ADDs.

The load units have been doubled and their load latency remains 4 cycles for basic operations. The Store unit also doesn’t seem to change in terms of its 1-cycle latency for basic stores.

The floating point and vector pipelines have seen the most changes in the Exynos M3. There are 3 pipelines now with distributed capabilities between them. Simple FP arithmetic operations and multiplication see a three-fold increase in throughput as all pipelines now offer the capability, compared to only one for the Exynos M1/M2. Beyond tripling the throughput, the latency of FP additions and subtractions (FADD, FSUB) is reduced from 3 cycles down to 2 cycles. Multiplication stays at a 4-cycle latency.

Floating point division sees a doubling of throughput, as two of the three pipelines are now capable of the operation, and latency has also been reduced from 15 cycles down to 12 cycles. Cryptographic throughput of AES instructions doubles as well, as two of the three pipelines are able to execute them. SHA instruction throughput remains the same. For simple vector operations we see a 50% increase in throughput due to the additional pipeline.



We're only scratching the surface of what Samsung's third-generation CPU microarchitecture is bringing to the table, but already one thing is clear: SLSI's claim of doubling single-threaded performance does not seem farfetched at all. What I've covered here are only the high-level changes in the pipeline configurations, and we don't know much at all about the improvements on the side of the memory subsystem. I'm still pretty sure that we'll be looking at large increases in the cache sizes, up to 512KB private L2s for the cores with a large 4MB DSU L3. Given the floating point pipeline changes I'm also expecting massive gains for such workloads. The front-end of the M3 microarchitecture is still a mystery, so here's hoping that Samsung will be able to re-attend Hot Chips this year for a worthy follow-up presentation covering the new design.

With all of these performance improvements, it’s also expected that the power requirements of the core will be greatly beyond those of existing cores. This seems a natural explanation for the two-fold single-core performance increase while the multi-core improvement remains at 40% - running all cores of such a core design at full frequency would indeed showcase some very high TDP numbers.

If all these projections come to fruition, I have no idea how Samsung’s mobile division is planning to equalise the CPU performance between the Exynos 9810 and against an eventual Snapdragon 845 variant of the Galaxy S9, short of finding ourselves in a best-case scenario for ARM’s A75 vs a worst-case for the new Exynos M3. With 2 months to go, we’ll have to wait & see what both Samsung mobile and Samsung LSI have managed to cook up.