Tim Cook Talks Daily Routine, Apple in Australia, and More in Interview
Ten years after being named CEO of Apple, Tim Cook recently sat down for an interview with The Australian Financial Review — his first with Australian media.
During the interview, Cook discussed his daily routine, explaining that he makes a point of reading as many customer emails as he can each morning and that he touches base with many of the company's divisions throughout his work week.
Cook went on to express his excitement about the future of artificial intelligence and augmented reality, and to underscore Apple's emphasis on privacy and security.
With 600,000 Australian developers registered with Apple to build apps for its ever-growing ecosystem of operating systems, Cook sees Australia as fertile ground for tech. Today, Apple commands 56% of Australia's mobile market.
The Apple CEO also had some choice words for Facebook and CEO Mark Zuckerberg, with whom Cook and Apple have been at odds for some time. The two companies recently butted heads over Apple's new App Tracking Transparency framework, which dealt a massive blow to Facebook's ad-targeting practices. Cook said:
An interconnected ecosystem of companies and data brokers, of purveyors of fake news and peddlers of division, of trackers and hucksters looking to make a quick buck is more present in our lives than it has ever been.
On the subject of Apple denying entities like Australian banks access to proprietary hardware such as the iPhone’s NFC chips for ‘tap and pay’ services, Cook interestingly said the following about placing backdoors in secure systems:
It’s the reality. If you put back doors in a system, anybody can use a backdoor. And so you have to make sure the system itself is robust and durable; otherwise you can see what happens in the security world. Every day you read about a breach, or you read about a ransomware.
The argument could be made that Apple's new Child Sexual Abuse Material (CSAM) detection system places a backdoor in an otherwise secure system, scanning minors' encrypted iMessage conversations for nudity and scanning media stored on adults' devices for images matching known CSAM.