In a new interview, Tim Millet, Apple's Vice President of Platform Architecture, offers some clues about the new A14 Bionic processor, the importance of machine learning, and how Apple continues to distance itself from the competition.
The new iPad Air was announced at Apple's September event, and inside the device the new A14 Bionic processor could prove to be a huge step up from the previous generation.
According to Apple, the A14 Bionic processor offers a 30% increase in CPU performance, while its new quad-core graphics architecture delivers a similar 30% boost, compared with the A12 Bionic built into the iPad Air 3. Compared to the A13, metrics suggest that the A14 offers a 19% improvement in CPU performance and a 27% improvement in graphics performance.
In an interview (in German) with Stern magazine, Apple's vice president of platform architecture Tim Millet offers some clues as to what makes the A14 Bionic processor special.
Millet explains that while Apple did not invent machine learning and the neural engines - "the foundation for this came from many decades ago" - they helped find ways to speed up the process.
Machine learning requires neural networks that must be trained on complex data systems, which, until recently, did not exist. As [data] storage capacity increased, machines could benefit from larger data packets, but the learning process remained relatively slow. However, in the early 2010s everything started to change.
In 2017, Apple launched the iPhone X, the first iPhone with Face ID. Its face-authentication process ran on the A11 chip, capable of processing 600 billion arithmetic operations per second.
The 5-nanometer A14 Bionic chip, which debuts in the new iPad Air (available for purchase in October), can perform more than eighteen times that number of operations: up to 11 trillion per second!
"We are excited about the emergence of machine learning and the way it enables a new class of devices," says Millet in his interview with Stern. "I gasp when I see what people can do with the A14 Bionic chip."
Of course, it's not just about the hardware when it comes to performance. Millet also notes that Apple hardware developers are in a unique position to work alongside software development teams.
Together, developers can make sure they are creating code that everyone can use.
"We work closely with the software team during development to ensure that we are not just creating a piece of technology that is useful to a few. We want to make sure that thousands upon thousands of iOS developers will be able to do something with it."
In the interview, he highlights the importance of Core ML, the basic machine learning framework that is often used for language processing, image analysis, and more. Apple has made Core ML easier for developers by allowing them to use machine learning in their apps.
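As an illustration (not from the interview), running an image-classification model through Core ML and Vision in Swift can be sketched as follows. The model name `MobileNetV2` is an assumption standing in for any image model added to an Xcode project, which generates the wrapper class used here:

```swift
import CoreML
import Vision
import UIKit

// Hypothetical sketch: classify a photo with a bundled Core ML model.
// "MobileNetV2" stands in for whatever .mlmodel the project includes.
func classify(image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNetV2().model) else { return }

    // Vision wraps the Core ML model and handles image scaling and cropping.
    let request = VNCoreMLRequest(model: model) { request, _ in
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([request])
}
```

On A-series chips with a Neural Engine, Core ML can schedule work like this onto that dedicated hardware automatically, which is the acceleration Millet describes.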
"Core ML is a fantastic opportunity for anyone who wants to understand and find out what options there are," says Millet. "It took us a long time to make sure we didn't put transistors on the chip that would go unused. We want everyone to be able to access them."
Stern notes that Core ML is a core component of the DJ app djay. It is also used by Adobe.
Finally, Tim Millet addresses the problem that Face ID does not work with the masks the coronavirus pandemic has forced people to wear around the world. He says that while Apple could, in theory, make Face ID work with a mask on, it probably won't. Covering your face removes much of the data the iPhone uses to confirm that it's really you, and working with less data would greatly increase the chances of Face ID being fooled.
"It's hard to see something you can't see," says Tim Millet. "The facial recognition models are very good, but it's a difficult problem to solve. People want convenience, but they also want security. And Apple is committed to keeping your data safe."