The smartphone is a marvel of miniaturisation. Each one contains multiple sensors that let it determine all kinds of things about the world and its place within it. Where it’s located. Which way up it is. How fast it’s moving. Some can recognise your fingerprints, your face and the words you’re speaking. All of this allows phones to entertain us, keep us healthy, help us find our way home and much else besides.

But a growing branch of science is devoted to pooling information generated by the world’s 3.5 billion smartphones for the common good. The latest example, launched by Google last week, is an earthquake detector. Install it on your Android phone and you’ll become part of a worldwide experiment in seismology. By picking up the first tremors of a quake, the software can help Google get early warnings out to the public. The technology was never designed for this purpose, but science is harnessing it regardless.

One of the first academics to work in this field was Eiman Kanjo, an associate professor at Nottingham Trent University in the UK. She gave the practice a name – “mobile sensing” – and has worked on multiple projects over the years, from detecting pollution levels and environmental noise to more recent efforts to battle the spread of Covid-19.

“More than 15 years ago we were working with more primitive phones which didn’t even have GPS or Bluetooth,” she says. “We immediately spotted the opportunity with smartphones. The beauty of it was these multiple phones collecting data simultaneously, and us being able to watch that data dynamically changing in real time. It was fascinating. When phones added new sensors, such as GPS and accelerometers, we had a party. Each one brings new possibilities.”

Gyroscopes track how the phone rotates, helping it work out which way it is oriented, switching displays between landscape and portrait mode and enabling augmented reality apps. Magnetometers measure magnetic fields, turning phones not only into compasses but also into rudimentary metal detectors. Cameras and biometric sensors allow the smartphone to see; microphones allow it to hear. Other recent additions include light scanners, barometers and movement detectors.

But it’s the accelerometer, the sensor used to detect the phone’s movement, that is key to Google’s new experiment.

“We figured out [Android phones are] sensitive enough to detect earthquake waves,” said Google’s Marc Stogaitis in an interview with <em>The Verge</em>. “They usually see both key types of waves, the P wave [primary] and the S wave [secondary]. Each phone is able to detect that something like an earthquake is happening, but you need an aggregate of phones to know that for sure.”

In a 2017 documentary entitled <em>The Crowd & the Cloud</em>, Brian Beveridge, the co-founder of an environmental project in California, gave an insight into the power of mobile sensing. “We sometimes frighten the statisticians, because they would prefer to have pristine data from a $1 million (Dh3.67m) machine,” he says. “But if you have a million $100 machines, all spewing data into the cloud, then we can start to adjust behaviour immediately.”

Google understood that while seismometers are expensive, phones are relatively cheap and very common. When the data from those phones is combined and analysed, anomalies can be filtered out and the location and strength of a quake can be pinpointed. The company stresses, however, that it is still just an experiment for the time being.
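Google has not published the exact algorithm behind the system, but the crowd-confirmation idea Stogaitis describes (one phone jolting means little; many phones jolting together in the same place at the same time means a lot) can be sketched in a few lines of Kotlin. Everything here, from the ShakeReport fields to the grid size and thresholds, is an illustrative assumption rather than Google’s implementation.

```kotlin
import kotlin.math.roundToInt

// One phone's report: "my accelerometer just registered a sudden jolt, here, at this time".
// All names and thresholds in this sketch are illustrative assumptions, not Google's.
data class ShakeReport(val epochMillis: Long, val latitude: Double, val longitude: Double)

data class QuakeCandidate(val cellKey: String, val reportCount: Int, val firstReportMillis: Long)

/**
 * Groups reports into coarse ~10km grid cells and ten-second windows.
 * A cell only becomes a candidate quake when many phones agree within one window,
 * so a single phone being dropped or waved about is filtered out as an anomaly.
 */
fun detectQuakes(
    reports: List<ShakeReport>,
    windowMillis: Long = 10_000,   // phones must agree within ten seconds...
    minReports: Int = 50           // ...and at least this many must agree
): List<QuakeCandidate> =
    reports
        .groupBy { r ->
            val latCell = (r.latitude * 10).roundToInt()    // ~0.1-degree cells
            val lonCell = (r.longitude * 10).roundToInt()
            val timeBucket = r.epochMillis / windowMillis
            "$latCell:$lonCell:$timeBucket"
        }
        .filterValues { it.size >= minReports }
        .map { (key, cell) -> QuakeCandidate(key, cell.size, cell.minOf { it.epochMillis }) }

fun main() {
    // 60 phones shaking in the same place within three seconds become a candidate quake;
    // one stray report from the other side of the world is ignored.
    val base = 1_600_000_000_000L
    val cluster = (1..60).map { ShakeReport(base + it * 50L, 37.77, -122.42) }
    val stray = ShakeReport(base, 51.50, -0.12)
    println(detectQuakes(cluster + stray))
}
```

Running main() prints a single candidate for the clustered reports and silently discards the stray one: the “aggregate of phones” doing its work.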
“The basic concept has been proven to work,” says Kanjo. “But they can’t risk it and say yes, we know the answer. The main issue is always scalability.”

This problem has recently been seen in attempts to use apps to contain the spread of Covid-19. In small-scale experiments, track-and-trace can be shown to work. But in the real world, there’s a fear of it either failing to detect cases or generating false positives, which can have a huge impact on society.

Fortunately, the stakes aren’t so high in the majority of mobile sensing applications. Millions of us see it at work every day as navigation apps alert us to the movement of road traffic. Smart cities use mobile sensing to change the way streets are lit, the way people park cars and the way pollution is measured. There’s also a big crossover with so-called “citizen science”: smartphones helping people get involved in everything from tracking asthma attacks to monitoring mosquitoes, from observing cloud patterns to saving the rainforests.

Only two things stand in the way of this science. One is the restrictions put in place by the likes of Google and Apple on the way sensors can be used. “We have suffered a lot as developers because of these restrictions,” says Kanjo. “It’s not always easy to develop ideas when companies are in control.”

The other is the attitude of the public to their phones being used in this way; passive data sharing has gained a somewhat toxic reputation in recent years. “The ethics are often exaggerated,” says Kanjo. “Right now, your phone company is being told that me and you are talking, and that’s far more intrusive than something like Covid contact tracing.” Google’s earthquake app, according to Stogaitis, de-identifies the information: “We don’t need to know anything about the person that’s sending it.”

Participation in mobile sensing experiments, in fact, could almost be seen as a noble act, and it certainly casts a new light on the smartphone: no longer a device centred completely on the individual, but one that can help the world understand a little more about itself.
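Stogaitis doesn’t spell out how that de-identification is done, but the general idea (strip anything that names the handset and blur the location before a reading leaves the phone) can be illustrated with a short Kotlin sketch. The field names, rounding choices and the deIdentify function below are hypothetical, not a description of Google’s pipeline.

```kotlin
import kotlin.math.roundToInt

// A raw reading as it might exist on the handset. Field names are hypothetical.
data class RawReading(
    val deviceId: String,          // identifies the phone; in this sketch it never leaves the device
    val epochMillis: Long,
    val latitude: Double,
    val longitude: Double,
    val peakAcceleration: Double   // strength of the jolt
)

// What gets uploaded: no identifier, and only a coarse location.
data class AnonymousReading(
    val epochMillis: Long,
    val coarseLatitude: Double,
    val coarseLongitude: Double,
    val peakAcceleration: Double
)

/** Drops the device identifier and rounds the location to roughly a kilometre. */
fun deIdentify(raw: RawReading) = AnonymousReading(
    epochMillis = raw.epochMillis,
    coarseLatitude = (raw.latitude * 100).roundToInt() / 100.0,
    coarseLongitude = (raw.longitude * 100).roundToInt() / 100.0,
    peakAcceleration = raw.peakAcceleration
)

fun main() {
    val onDevice = RawReading("hypothetical-device-123", 1_600_000_000_000L, 37.774929, -122.419418, 1.8)
    println(deIdentify(onDevice))  // prints a reading with no device ID and a blurred location
}
```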