CM3050 Topic 08: Sensor Programming
Main Info
Title: Sensor Programming
Teachers: Joe McAlister
Semester Taken: April 2022
Parent Module: CM3050: Mobile Development
Description
In this module, we learn about the sensors available to us on mobile devices, and how we can use them with React Native.
Key Reading
Lecture Summaries
8.0 Introduction to Sensors
A sensor is a device that detects or measures a physical property and responds to it.
Examples of sensors:
Accelerometers
Gyroscopes
Ambient Light Sensors
Cameras
We can classify three main categories of sensor:
Motion Sensors
These sensors measure acceleration forces and rotational forces along three axes.
Environment Sensors
These measure various environmental parameters, such as ambient air temperature and pressure, illumination and humidity.
Position Sensors
These measure the physical position of a device, eg orientation sensors, magnetometers.
When using a sensor, ask yourself: what is the need? You want to avoid a tech-demo feel. Think: if I didn't use the sensor, how would I do this? If all the answers are complex, then maybe a sensor is the way to go.
Also think about accessibility - some sensor-driven motions might be hard for some categories of users.
There may be visual changes based on assistive technology, eg inverted colours, high contrast, bolded text or button shapes.
Remember that sensor quality differs across devices; newer iPhones will be better than older ones. How do you guarantee a consistent experience? Test across a range of devices.
8.1 The Camera
RN simplifies camera access for us. The library is expo-camera.
You have to ask for user permission with Camera.requestPermissionsAsync() - only required once at the moment. If the user denies access, they will need to go into Settings to change it. The call returns a promise, so you can embed it in a useEffect hook.
Once we have permission we can display the feed within a Camera component, whose contents will be rendered over the camera frame. We provide a type prop like this: <Camera type={Camera.Constants.Type.back}>
You can call a method to get the available types.
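Putting permission and display together, a minimal sketch (assuming the 2022-era expo-camera API; Camera.requestPermissionsAsync has since been superseded by requestCameraPermissionsAsync in newer SDKs):

```javascript
// Sketch: minimal camera screen with permission request (expo-camera).
import React, { useState, useEffect } from 'react';
import { Text } from 'react-native';
import { Camera } from 'expo-camera';

export default function CameraScreen() {
  const [hasPermission, setHasPermission] = useState(null);

  useEffect(() => {
    // requestPermissionsAsync returns a promise resolving to { status, ... }
    (async () => {
      const { status } = await Camera.requestPermissionsAsync();
      setHasPermission(status === 'granted');
    })();
  }, []);

  if (hasPermission === null) return <Text>Requesting permission…</Text>;
  if (hasPermission === false) return <Text>No access to camera</Text>;

  return <Camera style={{ flex: 1 }} type={Camera.Constants.Type.back} />;
}
```

Rendering alternative text while the promise is pending avoids flashing the denied state on first mount.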
You can style the view, and use props to control the camera:
flashMode
controls the camera flash, via Camera.Constants.FlashMode. When on, the flash will fire when a photo is taken; when off it will not; auto lets the camera decide.
autoFocus
controls autofocus, via Camera.Constants.AutoFocus. When on, the camera will autofocus; when off, it will lock focus to the state it was in when the mode was turned off. Can be adjusted on some devices with the focusDepth prop.
zoom
a float between 0.0 and 1.0, from minimum to maximum zoom.
whiteBalance
controls the white balance: auto, sunny, cloudy, shadow, fluorescent or incandescent.
focusDepth
the distance to the plane of sharpest focus, a float between 0.0 and 1.0, where 0 is infinity focus and 1 is as close as possible. On Android, only available when useCamera2Api is true.
ratio
Android only; a string setting the aspect ratio of the camera preview, eg '4:3'.
pictureSize
a string representing the size of the pictures takePictureAsync will take; available sizes can be fetched with getAvailablePictureSizesAsync().
onCameraReady
callback invoked when camera preview is set.
onFacesDetected
callback invoked with the results of face detection on the preview. See the FaceDetector docs for details.
onBarCodeScanned
callback for when a bar code has been scanned. The callback is provided with an object of the shape {type: <type>, data: <data>}, where type is the type of the bar code scanned, and data is the information encoded in it.
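Several of these props can be combined on one Camera component. A sketch (the prop values are illustrative; the constant names follow the expo-camera API described above):

```javascript
// Sketch: a Camera view combining several of the props above (expo-camera).
import React from 'react';
import { Camera } from 'expo-camera';

export function ScannerView() {
  return (
    <Camera
      style={{ flex: 1 }}
      type={Camera.Constants.Type.back}
      flashMode={Camera.Constants.FlashMode.auto}
      autoFocus={Camera.Constants.AutoFocus.on}
      zoom={0.1} // float from 0.0 (min) to 1.0 (max)
      whiteBalance={Camera.Constants.WhiteBalance.auto}
      onCameraReady={() => console.log('preview ready')}
      // receives { type, data } for each detected bar code
      onBarCodeScanned={({ type, data }) => console.log(type, data)}
    />
  );
}
```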
8.2 Haptics
Haptics and vibrations provide immediate feedback and notifications - like a phone vibrating in silent mode.
Haptics are popular on Apple devices, eg the Apple Watch's crown simulating a physical click.
Haptics are precise, tap-like vibrations.
Vibrations are entire-device vibrations.
Haptics are iOS only; vibrations are compatible with both iOS and Android.
If you use the haptics library in Expo, it will fall back to vibration on Android - this is the recommended approach.
Avoid overusing haptics, they can become tiresome. Avoid designing experiences around extended haptics. Test them. Allow options so people can turn them off.
Haptics may not work, eg if the user has disabled them system-wide, or if Low Power Mode is on. So don't depend on them.
We can import the expo-haptics library after installing it as follows:
import * as Haptics from 'expo-haptics';
Haptics.impactAsync(style);
//style has the following options
Haptics.ImpactFeedbackStyle.{Light|Medium|Heavy}
Haptics.notificationAsync(type)
//type has the following options
Haptics.NotificationFeedbackType.{Success|Warning|Error}
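For instance, a button that gives a light tap when pressed might look like this (a sketch; on Android the Expo library falls back to vibration):

```javascript
// Sketch: firing haptic feedback from a button press (expo-haptics).
import React from 'react';
import { Button } from 'react-native';
import * as Haptics from 'expo-haptics';

export function HapticButton() {
  const onPress = () => {
    // Light, tap-like feedback; Medium/Heavy give stronger effects.
    Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Light);
    // For success/failure outcomes, notificationAsync fits better:
    // Haptics.notificationAsync(Haptics.NotificationFeedbackType.Success);
  };
  return <Button title="Tap me" onPress={onPress} />;
}
```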
8.3 GPS
Global Positioning System (GPS) is formed of more than 30 satellites circling the Earth. A device detects signals from the satellites, and the time taken to receive them is used to calculate the device's position. It takes at least 4 satellites to create a precise location (the 4th determines altitude).
Satellites act like the stars in constellations - we know where they are supposed to be at any given time. This gives a position accurate to roughly 2 m.
More modern tech can position within inches, but not in consumer products yet.
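The timing idea is simple arithmetic: signals travel at the speed of light, so the travel time from each satellite gives a distance. A quick illustration (the numbers are illustrative, not real GPS data):

```javascript
// Illustration of the time-of-flight ranging behind GPS:
// distance = speed of light x signal travel time.
const SPEED_OF_LIGHT = 299792458; // metres per second

function distanceFromTravelTime(seconds) {
  return SPEED_OF_LIGHT * seconds;
}

// A GPS satellite orbits roughly 20,200 km up, so a signal
// takes on the order of 0.067 s to arrive.
const metres = distanceFromTravelTime(0.0674);
console.log(Math.round(metres / 1000)); // ≈ 20206 km
```

With distances to at least four satellites of known position, the receiver can solve for latitude, longitude, altitude and its own clock error.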
GPS uses a LOT of power, which can annoy users.
It requires consent, and people can and will deny you access. iOS lets users provide an approximate location, within tens of metres, rather than their accurate position.
We can use expo-location to subscribe to location updates, poll for the location, etc.
Permission is similar to camera.
Once you've got permission you can get the current location with getCurrentPositionAsync. It's not a fast process - it polls the location hardware in real time.
You can call getLastKnownPositionAsync instead to get a faster, but less accurate, last known position.
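A sketch of the flow (assuming expo-location's requestForegroundPermissionsAsync, getCurrentPositionAsync and getLastKnownPositionAsync, which are the relevant calls in 2022-era SDKs):

```javascript
// Sketch: asking for permission, then reading the location (expo-location).
import * as Location from 'expo-location';

async function whereAmI() {
  const { status } = await Location.requestForegroundPermissionsAsync();
  if (status !== 'granted') return null; // user said no - respect it

  // Fast, but may be stale or null:
  const last = await Location.getLastKnownPositionAsync();

  // Slower but current - polls the hardware in real time:
  const current = await Location.getCurrentPositionAsync({});

  return current ?? last;
}
```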
You can use other helper functions:
Location.geocodeAsync(address)
takes address information and turns it into latitude and longitude.
Location.reverseGeocodeAsync(location)
the reverse: turns latitude and longitude into address information. Use sparingly - it is very computationally taxing.
Location.startLocationUpdatesAsync
receive updates when the location changes. Needs to be reviewed on Android, as it's privacy sensitive.
Location.startGeofencingAsync(taskName, regions)
receive updates when a user enters or leaves a geofenced area.
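The two geocoding helpers in use (a sketch; the address string is just an example, and both calls resolve to arrays of candidate results):

```javascript
// Sketch: forward and reverse geocoding with expo-location.
import * as Location from 'expo-location';

async function lookupAddress(address) {
  // Address string -> array of { latitude, longitude, ... } candidates.
  const [coords] = await Location.geocodeAsync(address);

  // Latitude/longitude -> array of address objects.
  // Computationally taxing - use sparingly.
  const [place] = await Location.reverseGeocodeAsync({
    latitude: coords.latitude,
    longitude: coords.longitude,
  });
  return place;
}
```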
8.4 Accelerometers
A sensor that measures the acceleration forces acting on an object. It is what handles flipping the orientation of your phone. Often combined with a gyroscope, which provides more information by tracking rotation and twist.
It can be used as an input device, a data-capture device (eg tracking steps), or a shortcut (eg shake to undo).
Always consider accessibility.
To use it you add a listener to the Accelerometer.
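A sketch of that listener pattern, assuming the expo-sensors library (which is where Expo exposes the Accelerometer):

```javascript
// Sketch: subscribing to accelerometer readings with expo-sensors.
import { useEffect, useState } from 'react';
import { Accelerometer } from 'expo-sensors';

export function useAccelerometer() {
  const [reading, setReading] = useState({ x: 0, y: 0, z: 0 });

  useEffect(() => {
    Accelerometer.setUpdateInterval(100); // ms between readings
    const subscription = Accelerometer.addListener(setReading);
    return () => subscription.remove(); // unsubscribe on unmount
  }, []);

  return reading; // { x, y, z } acceleration along each axis
}
```

Remember to remove the listener when the component unmounts, or readings will keep draining the battery in the background.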