Two operating systems now dominate the mobile space: Android and iOS. There used to be a third, Windows, but Microsoft announced in 2019 that it was ending support for Windows 10 Mobile. You of course know how to use your mobile device, and maybe you even develop applications for it, but do you know how it works?
In this article, you will learn how Android and iOS work today. You will discover the different layers of architecture that let applications reach hardware sensors like GPS through code. The two operating systems differ in many ways, but they generally pursue the same goal: giving users super-human capabilities through hardware components, in an energy-efficient and secure manner.

Android
Android applications can be written in C++, Java, or Kotlin, with Kotlin being Google’s preferred language. These object-oriented programming (OOP) languages are compiled and packaged into an Android App Bundle (.aab), the publishing format for a mobile app. On the device, app code runs in the Android Runtime (ART) or, on older devices, in the Dalvik Virtual Machine (DVM).

ART is a managed runtime that understands DEX bytecode, which it translates into native machine code. The old DVM used Just-in-Time (JIT) compilation to compile application code while the application was running, whereas ART originally used Ahead-of-Time (AOT) compilation to compile the code before the application runs. With Android 7 (Nougat), the engineers changed ART to use AOT + JIT together. The JIT compiler complements ART’s AOT compiler and improves runtime performance, and the combination lets devices save storage and speeds up app and system updates.
This is a form of profile-guided compilation that allows ART to manage the compilation of each app according to its real-world usage. ART maintains a profile of each application’s most-used code and precompiles and caches that code to achieve the best performance.
The Android operating system is an open-source operating system built on top of the Linux kernel. A kernel is the program at the core of a device’s OS that has complete control over everything in the system. The kernel provides access to native hardware and system features such as Bluetooth, GPS, accelerometers, push notifications, alarms, and much more through Services. These services enable communication between device hardware features and applications with the help of something called the Binder IPC in Android’s kernel.
Binder IPC
The Binder Inter-Process Communication (Binder IPC) mechanism allows mobile applications to work with system services. It lives in the Linux kernel as a driver, but Android does not rely on the standard Linux IPC mechanisms, because Binder avoids the unnecessary copying and allocation that those mechanisms incur. This is what enables high-level framework code written in Java or Kotlin to communicate with Android system services.
Android System Service
System services are the programs, hosted in the system server, that do the grunt work of talking to hardware and interacting with drivers. There is only one instance of any given service: whether it is the Bluetooth service or the location service, exactly one instance of that process exists on the device. Every application that wants to interact with a service does so by using a Manager.
When you think about multiple applications running on an Android device, each application has its own Managers running inside its own process. So there may be multiple Managers talking to the alarm service, but there is only ever one instance of the alarm service itself.
In Android, from framework code written in Java or Kotlin, you access these Managers via Context.getSystemService(), passing a name constant such as Context.LOCATION_SERVICE or Context.ALARM_SERVICE. Pretty much every top-level component that you interact with either is a Context or has access to one. A Context can be just about anything in Android, such as an Activity or a Service, and Contexts are used just about everywhere in the Android framework, so they are readily available.
A top-level Context such as an Activity is actually an implementation of something called a ContextWrapper. Inside the ContextWrapper for an Activity sits the core ContextImpl, and its implementation is the same whichever Context you are using.
The ContextImpl has access to something called the SystemServiceRegistry. This registry is where all the different Manager classes are registered. The individual Manager instances are loaded lazily and cached within your application’s process. The first time your application asks for the LocationManager, for example, the framework creates an instance of that Manager as a singleton and registers it in the SystemServiceRegistry.

HAL
The Hardware Abstraction Layer (HAL) is what allows the system services to communicate with the Linux kernel. Because not all Android devices are created equal, the components in the HAL are designed to be device-specific. If you have ever seen a .so file, then you have seen a HAL component built in C/C++. These files work as a bridge between the Android framework and the Linux kernel.
The Linux kernel only understands system calls, while the Android framework exposes the underlying hardware to apps through Java APIs rather than system calls. When an Android application asks a system service to communicate with a Linux device driver, the service does so using a HAL .so module. The HAL generates the system calls the Linux kernel understands, bridging the communication from mobile app to device driver.

Linux Kernel
Every operating system has a kernel that manages CPU and memory resources as well as the processes running on the computer. The Linux kernel in Android allows applications to interact with device drivers such as USB, Wi-Fi, audio, Bluetooth, camera, and so on. In Android, when you launch an application, it is the kernel that starts the process your app lives in and loads the app from flash storage into memory.
Under the hood, Linux is just a kernel, and that kernel is surrounded by programs that make it usable by everyday users. Android uses a Linux kernel similar to the one on a desktop; however, it pulls in only the libraries it needs, using static linking.

iOS
The iPhone Operating System (iOS) was created by Apple to power iPhones, iPads, and the iPod Touch. You can use the Objective-C or Swift programming languages to create an iOS App Store Package (.ipa), the binary file that is deployed to iOS devices. iOS organizes its instruments and technologies into a sequence of layers within the operating system. There are four layers in iOS that allow applications to interact with the device and its components.

When we develop iOS apps, the device does not allow us direct access to its hardware. All interactions with hardware components actually take place through the different intermediaries that make up the iPhone’s anatomy. These intermediary libraries live in one of the four layers of iOS: Cocoa Touch, Media, Core Services, and Core OS. Each layer has its own frameworks, SDKs, and kits that it manages to provide a fluid mobile experience. iOS is generally considered more locked down than Android because it cannot be rooted and mobile apps cannot be side-loaded onto the device. Side-loading is deploying a mobile app to a device, for example over USB, without going through Apple’s certificate signing.
Cocoa Touch Layer
The Cocoa Touch layer is the outermost layer of the iPhone operating system and contains the most common features that developers need access to. This layer is mainly written in Objective-C and was extended from the OS X Cocoa API to meet the requirements of iOS devices. The Cocoa Touch layer provides access to several popular software kits and services, with UIKit being the most important of all.
The UIKit framework is a feature-rich programming interface built in Objective-C, and it is by far the most widely used kit within iOS. It is responsible for rendering native UI elements such as buttons, text boxes, and lists, and it also manages the application lifecycle.
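To give a feel for UIKit, here is a minimal, hypothetical Swift view controller; the class name and strings are illustrative only. It draws a label and a button and reacts to a tap.

```swift
import UIKit

// A minimal UIKit sketch: one screen with a label and a button whose tap updates the label.
final class GreetingViewController: UIViewController {
    private let label = UILabel(frame: CGRect(x: 20, y: 120, width: 280, height: 40))

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground

        label.text = "Hello"
        view.addSubview(label)

        let button = UIButton(type: .system)
        button.frame = CGRect(x: 20, y: 180, width: 280, height: 44)
        button.setTitle("Tap me", for: .normal)
        button.addTarget(self, action: #selector(didTap), for: .touchUpInside)
        view.addSubview(button)
    }

    @objc private func didTap() {
        label.text = "Button tapped"   // UIKit re-renders the label for us
    }
}
```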
Another member of the Cocoa Touch layer is GameKit. GameKit is a comprehensive framework that provides the peer-to-peer messaging and voice connectivity that enable multiplayer gaming in an iOS app. GameKit communicates with the device’s Game Center application to give the app social-gaming features such as leaderboard rankings, real-time and turn-based multiplayer networking, and more to enhance competitiveness for its users.
If you have ever used a driving-based application such as Lyft, DoorDash, or Waze, then odds are you have interacted with MapKit. MapKit provides map UI that can be customized with routes, pin markers, and geographic overlays that correspond to the device’s location.
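As a rough Swift sketch of what that looks like, dropping a pin and centering the map takes only a few lines; the coordinates and title below are made up.

```swift
import MapKit
import CoreLocation

// Drop a single pin on a map view and zoom to roughly a 1 km region around it.
let mapView = MKMapView()
let pin = MKPointAnnotation()
pin.coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)  // hypothetical pickup spot
pin.title = "Pickup"
mapView.addAnnotation(pin)
mapView.setRegion(MKCoordinateRegion(center: pin.coordinate,
                                     latitudinalMeters: 1_000,
                                     longitudinalMeters: 1_000),
                  animated: false)
```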
MessageUI is another Cocoa Touch utility kit; it provides classes that give developers everything they need to handle email. Recipients, a subject, and a body can be passed from your application into its native mail-composition UI.
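A hedged Swift sketch of that flow: a view controller presents the system mail composer with a prefilled recipient, subject, and body (the email address is a placeholder), and the user still decides whether to send.

```swift
import UIKit
import MessageUI

// Presents the native mail composer; the delegate callback dismisses it when the user is done.
final class FeedbackViewController: UIViewController, MFMailComposeViewControllerDelegate {
    func sendFeedback() {
        guard MFMailComposeViewController.canSendMail() else { return }  // no mail account configured
        let composer = MFMailComposeViewController()
        composer.mailComposeDelegate = self
        composer.setToRecipients(["support@example.com"])   // hypothetical address
        composer.setSubject("App feedback")
        composer.setMessageBody("Hi team, ...", isHTML: false)
        present(composer, animated: true)
    }

    func mailComposeController(_ controller: MFMailComposeViewController,
                               didFinishWith result: MFMailComposeResult,
                               error: Error?) {
        controller.dismiss(animated: true)
    }
}
```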
The Apple Push Notification Service (APNS) is a very important service that enables push notifications to be displayed when they are pushed from the server side. When an application receives a push notification, the device shows an alert banner. The attributes of a push notification are defined by the JSON payload sent from a backend API that you support.
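The client side of that flow is small. A hedged Swift sketch, typically called from the app delegate: ask the user for permission, then register with APNS to obtain the device token your server will push to.

```swift
import UIKit
import UserNotifications

// Ask permission to show alerts/badges/sounds, then register with APNS for a device token.
func enablePushNotifications() {
    UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .badge, .sound]) { granted, _ in
        guard granted else { return }
        DispatchQueue.main.async {
            UIApplication.shared.registerForRemoteNotifications()  // token arrives via the app delegate
        }
    }
}
```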
Media Layer
The next layer of iOS is the Media layer. As you may have guessed, the Media layer is responsible for rendering high-quality video, audio, and beautiful 2D and 3D animations. Just like the Cocoa Touch layer, the Media layer is a collection of software kits and services, but its objective is to help present audio and visuals to the user.
Thanks to some pretty savvy APIs, the Media layer can support over 100 different media formats. It is made up of powerful animation frameworks that handle different media types and provide a rich 2D and 3D experience.
UIKit is a fundamental player in the Media layer too, providing support for images and animating views. UIKit and the Core Image framework work together to handle images. UIKit can also display documents, draw images, report device information, and handle text management, search, accessibility features, and asset management.
Core Graphics is an SDK that you have likely used in the past without realizing it. It is the native drawing engine for iOS, handling path-based drawing and 2D rendering using Quartz technology. This SDK assists with both vector (path-based) and pixel-based image rendering.
The Core Animation software kit is designed to optimize animation code. Its purpose is to deliver high frame rates and smooth animations without hindering CPU usage. It is the mechanism that drives the animation loops that redraw content frame after frame.
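For instance, fading a view out with Core Animation is a one-object affair. A minimal Swift sketch, where the view is just a placeholder created for the example:

```swift
import UIKit
import QuartzCore

// Core Animation runs this fade on the render server rather than in your app's drawing code.
let someView = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))  // placeholder view
let fade = CABasicAnimation(keyPath: "opacity")
fade.fromValue = 1.0
fade.toValue = 0.0
fade.duration = 0.5
someView.layer.add(fade, forKey: "fadeOut")
```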
Do you like listening to music on your iPhone? Then you have probably used the Media Player framework to do so. The Media Player API works alongside MusicKit and allows applications to search for and play music from the user’s music library.
AVKit is a powerful SDK that makes the playback of audio and video possible. You can also capture and record your own audio and video using the underlying AVFoundation classes that AVKit builds on.
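Here is a small Swift sketch of playback with AVKit, assuming you are inside a view controller; the video URL is a placeholder.

```swift
import AVKit
import AVFoundation

// Stream a remote video with the system's built-in player UI.
let url = URL(string: "https://example.com/sample.mp4")!   // placeholder URL
let playerController = AVPlayerViewController()
playerController.player = AVPlayer(url: url)
// From a UIViewController:
// present(playerController, animated: true) { playerController.player?.play() }
```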
GLKit is used to manage the 2D and 3D rendering performed by hardware-accelerated interfaces. Its purpose is to speed up OpenGL ES development with helpers for features such as texture loading and shaders.
Core Services Layer
Some of the most advanced SDKs in iOS are found in the Core Services layer of the operating system. Within Core Services you will find the SDKs behind the super-powers of our cellular devices, the ones that make us feel superhuman. The ability to instantly know and share a location, check health metrics, and reach people across the globe are capabilities we would have considered supernatural a century ago.
The GPS capability of your device resides in the Core Services layer of iOS. The Core Location framework provides real-time global coordinates of your device thanks to satellites in space and GPS receivers. Your iPhone has other forms of location tracking, such as cellular tower pings and Wi-Fi data, which this API also draws on.
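A hedged Swift sketch of that API: request permission, start updates, and read coordinates back through a delegate.

```swift
import CoreLocation

// Asks for when-in-use permission and prints each coordinate fix as it arrives.
final class LocationReader: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()   // requires a usage description in Info.plist
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let coordinate = locations.last?.coordinate else { return }
        print("lat: \(coordinate.latitude), lon: \(coordinate.longitude)")
    }
}
```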
You’re running down a trail with your iPhone and you finally stop to rest after a few miles. You open an application that uses HealthKit, and you can quickly see how far you’ve run, the number of stairs climbed, and how many calories you’ve burned. HealthKit has been an explosive technology that lets mobile applications act like a nurse, providing useful health analytics with no appointment necessary.
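Reading a metric like today’s step count looks roughly like the following Swift sketch; it assumes the user has already granted the app HealthKit read access to step data.

```swift
import HealthKit

// Sum the user's step samples from midnight until now.
let store = HKHealthStore()
let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount)!
let startOfDay = Calendar.current.startOfDay(for: Date())
let predicate = HKQuery.predicateForSamples(withStart: startOfDay, end: Date(), options: .strictStartDate)

let query = HKStatisticsQuery(quantityType: stepType,
                              quantitySamplePredicate: predicate,
                              options: .cumulativeSum) { _, result, _ in
    let steps = result?.sumQuantity()?.doubleValue(for: .count()) ?? 0
    print("Steps today: \(Int(steps))")
}
store.execute(query)
```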
Within the Core Services layer lives the Address Book UI framework. This framework allows applications to retrieve, edit, and create contacts in the iPhone’s address book. It provides access to ‘people pickers’ that can be easily customized to the theme of your application, along with search elements that let applications quickly query the Address Book database, and much more.
CloudKit provides the capability for moving data between your mobile app and iCloud. CloudKit keeps each application’s information on the user’s device isolated, so there are no worries about data entanglement. This utility also helps sync application data when a user switches to a new iOS device.
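Saving a record to the user’s private iCloud database is a short call; in this Swift sketch the record type and field name are hypothetical.

```swift
import CloudKit

// Persist one record to the app's private database in the user's iCloud account.
let note = CKRecord(recordType: "Note")                   // hypothetical record type
note["text"] = "Remember the milk" as NSString            // hypothetical field
CKContainer.default().privateCloudDatabase.save(note) { _, error in
    if let error = error {
        print("CloudKit save failed: \(error)")
    }
}
```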
Core Data is used to store your app cache for offline access. It works with CloudKit to send the stored application data to iCloud during times of synchronization.
The Foundation framework is built in Objective-C and provides a huge assortment of functionality such as text processing, date and time calculations, sorting and filtering, and networking.
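A tiny taste of Foundation in Swift, no UI involved:

```swift
import Foundation

// Date math, locale-aware sorting, and simple filtering all come from Foundation.
let tomorrow = Calendar.current.date(byAdding: .day, value: 1, to: Date())!
let names = ["Charlie", "alice", "Bob"]
    .sorted { $0.localizedCaseInsensitiveCompare($1) == .orderedAscending }
print(tomorrow, names)
```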
Another key ingredient in the Core Services recipe is the Core Foundation framework. It was originally created to ease the transition from classic Mac OS to Mac OS X. It is a C-based counterpart to the Foundation library, which is built in Objective-C.
Did you know that when your iPhone is falling, it uses its accelerometers to detect the descent and put itself into a safe mode to prevent damage? This is made possible by the Core Motion framework. Core Motion does much more than that: it hooks into gyroscopes, accelerometers, pedometers, magnetometers, and barometers to provide motion-related detection capabilities.
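A hedged Swift sketch of reading the accelerometer through Core Motion:

```swift
import CoreMotion

// Sample the accelerometer ten times per second; values are in g's along the x/y/z axes.
let motion = CMMotionManager()
if motion.isAccelerometerAvailable {
    motion.accelerometerUpdateInterval = 0.1
    motion.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        print("x: \(a.x), y: \(a.y), z: \(a.z)")
    }
}
```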
“Hey Siri, turn on the lights.” The HomeKit ecosystem works with smart home devices that are connected to the internet, such as thermostats, window blinds, smart light bulbs, and more. It is Apple’s answer for connecting your iOS devices to IoT devices.
The Social framework is an API used to connect to social networks such as Facebook, Twitter, or Sina Weibo in China. If these companies stand the test of time, you can use this API to connect to them. I hope to see support for Truth Social in future releases of iOS.
StoreKit is a powerful API used for in-app purchases, advertisements, Apple Music, and App Store recommendations and reviews. We all want our apps to make us some cheddar, and StoreKit provides a secure library for charging purchases to the user’s App Store account. Reviews are extremely important when it comes to being competitive in the dog-eat-dog world of mobile development; you wouldn’t go to a dentist with a terrible review, and app reviews carry the same power within the community.
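The review side of StoreKit is the simplest piece to show. A minimal sketch that asks iOS to surface the in-app rating prompt; note the system, not your code, decides whether it actually appears.

```swift
import StoreKit

// Ask the system to show the in-app review prompt; iOS throttles how often this is honored.
func askForReviewIfAppropriate() {
    SKStoreReviewController.requestReview()
}
```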
Core OS
The Core OS layer of iOS devices is responsible for communicating with essential hardware sensors. The frameworks contained in the Core OS layer support 64-bit architectures, which enables mobile applications to run faster.

One of my favorite technologies is found in the Core OS layer. Bluetooth was developed at Ericsson in the 1990s and was named after the 10th-century Danish king Harald “Bluetooth” Gormsson. Even the Bluetooth symbol is a combination of the Nordic runes for his initials, H and B. In iOS, the Core Bluetooth framework is what lets apps communicate with Bluetooth Low Energy accessories.
This layer also contains the Security Services framework, which is used to secure passwords, authenticate Apple accounts, protect payments, and safeguard user privacy.
Accelerate is a cornerstone API within the Core OS layer of iOS. Despite the name, this framework has nothing to do with accelerometers; its purpose is to accelerate and optimize mathematical and image computations with the goal of improving energy consumption.
You can use the External Accessory framework to set up and control connections to MFi accessories. If you were to build a new IoT device using a microcontroller like an Arduino, you would find that you cannot connect to your creation without MFi certification. It is possible to obtain an MFi license from Apple to certify your device and allow the External Accessory framework in iOS to connect to it.
Face ID and Touch ID are used to enable secure access to applications through the Local Authentication framework. To ensure the security of your identity, applications never gain access to your biometric data; they only learn whether the authentication was successful.
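A hedged Swift sketch using the Local Authentication framework; the prompt text is illustrative, and the app only ever sees a boolean result.

```swift
import LocalAuthentication

// Prompt for Face ID / Touch ID; the biometric data itself never reaches the app.
let context = LAContext()
var error: NSError?
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, _ in
        print(success ? "Authenticated" : "Authentication failed")
    }
}
```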
So, we have covered the basics of the four layers of iOS: Cocoa Touch, Media, Core Services, and Core OS. Each layer has its own SDKs that work together to give applications access to different technologies and sensors on the device. This operating system is quite different from Android in that it organizes its features into SDKs and frameworks, whereas Android manages these kinds of technologies through Managers and system services. iOS is constantly evolving, and the SDKs are sometimes moved and refactored, but their objectives remain the same.
Summary
The inner workings of the two mobile operating systems are quite complex, and this is just a high-level overview. Android provides access to hardware using Managers backed by Services, while iOS provides access through kits in the different layers of its operating system. There is a lot going on under the screen that you may not have realized, and designing everything to work in an energy-efficient manner must have been a challenge. I hope you enjoyed reading this article and learning more about mobile architectures. Thanks for reading!