Lately I've been playing with the accelerometer sensor, which basically reads the gravitational force acting on your device in three dimensions. But how can this extra information be useful? I'll show you using the Pokemon app I'm making.
Accelerometer
The x, y, z reading from the accelerometer (a GY-521 in my case) is kind of interesting, especially after the readings are calibrated.
AccelX:1.09, AccelY:-0.41, AccelZ:8.98
The biggest number turns out to be z, which says the device is facing upward along the z axis, and in theory it should be close to 9.8 m/s². Bingo, physics!
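A quick sanity check (a plain JavaScript sketch, outside the app): since at rest the accelerometer only measures gravity, the magnitude of the three components should stay roughly constant however the device is oriented.

```javascript
// Magnitude of the acceleration vector in m/s².
// At rest this should sit near standard gravity (~9.81),
// regardless of how the device is oriented.
function magnitude({ x, y, z }) {
  return Math.sqrt(x * x + y * y + z * z);
}

// Reading from above: AccelX:1.09, AccelY:-0.41, AccelZ:8.98
console.log(magnitude({ x: 1.09, y: -0.41, z: 8.98 }).toFixed(2));
```

The real readings are a bit noisy, so the result hovers around 9–11 rather than landing exactly on 9.81.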
When you rotate your device in 3D space, you will soon find that any of the three axes can face the ground. For instance, here are a couple of readings for different device orientations:
AccelX:-0.41, AccelY:-0.45, AccelZ:9.00    FACE UP
AccelX:-0.36, AccelY:-0.57, AccelZ:-10.78  FACE DOWN
AccelX:0.58,  AccelY:-10.21, AccelZ:0.55   FACE ME: Vertical
AccelX:-9.42, AccelY:-0.39, AccelZ:0.73    FACE ME: Horizontal
AccelX:-0.26, AccelY:-8.10, AccelZ:6.80    PUT ON STAND
AccelX:2.99,  AccelY:-4.50, AccelZ:12.41   PICK IT UP
If you put the device face down, the z reading becomes negative. And if you make the device face you, the z axis becomes perpendicular to gravity (parallel to the ground), so the z reading drops close to 0.
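The pattern above can be captured in a few lines of plain JavaScript: find the axis with the largest absolute reading and use its sign. This is a simplified sketch, and the exact labels depend on your device's axis conventions; real implementations also debounce and ignore readings taken while the device is moving.

```javascript
// Classify a rough device orientation from one accelerometer
// reading (in m/s²) by picking the dominant axis and its sign.
// Simplified sketch; label-to-axis mapping is an assumption
// based on the sample readings above.
function orientation({ x, y, z }) {
  const ax = Math.abs(x), ay = Math.abs(y), az = Math.abs(z);
  if (az >= ax && az >= ay) return z > 0 ? "face up" : "face down";
  if (ay >= ax) return y < 0 ? "portrait" : "portrait upside down";
  return x < 0 ? "landscape left" : "landscape right";
}

console.log(orientation({ x: -0.41, y: -0.45, z: 9.0 }));  // face up
console.log(orientation({ x: 0.58, y: -10.21, z: 0.55 })); // portrait
```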
I think you probably get the idea now. This is exactly how our phone knows when it's face down or in portrait/landscape mode, etc. You can also detect other events, such as the phone suddenly being picked up. I was told that the phone actually relies on more than one sensor, so that it can get a more accurate picture of what the user is doing with it.
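One classic way to combine two sensors is a complementary filter: the gyroscope gives a smooth but slowly drifting rotation rate, while the accelerometer gives a noisy but drift-free tilt angle, and blending them produces a stable estimate. This is an illustrative sketch of the general idea, not the actual fusion algorithm phones ship with.

```javascript
// Complementary filter: trust the integrated gyroscope rate for
// short-term changes, and the accelerometer angle for the long-term
// baseline. alpha close to 1 favors the gyroscope. (Illustrative
// sketch only; real phones use more elaborate sensor fusion.)
function complementaryFilter(angle, gyroRate, accelAngle, dt, alpha = 0.98) {
  return alpha * (angle + gyroRate * dt) + (1 - alpha) * accelAngle;
}

// With no rotation and a steady accelerometer angle of 10 degrees,
// the estimate converges toward 10 over repeated updates.
let estimate = 0;
for (let i = 0; i < 500; i++) {
  estimate = complementaryFilter(estimate, 0, 10, 0.02);
}
console.log(estimate.toFixed(1)); // 10.0
```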
Alright, now that we know what the accelerometer is, how can we put the data to use? Last time we used the phone's touch screen to drag the Pokemon around; let's see if we can do something similar with the gravitational force.
Sensors
We first need to read the gravity vector via a sensor library. Let's install the expo-sensors library; here's the documentation.
import React, { useEffect } from "react";
import { Accelerometer } from "expo-sensors";

// Poll the accelerometer five times per second.
Accelerometer.setUpdateInterval(200);

export function Gravity() {
  ...
  useEffect(() => {
    // Subscribe on mount; each reading nudges the tilt value.
    const monitor = Accelerometer.addListener((gravity) => {
      tilt.value = tilt.value - gravity.x * 50;
    });
    // Unsubscribe on unmount so the listener doesn't leak.
    return () => {
      monitor.remove();
    };
  }, []);
The Accelerometer is a subscription service that updates us with the acceleration forces at a set interval. Whenever a reading arrives, we update our tilt value. We only use the x component of the gravitational force in this example, which tracks whether we are tilting the phone left or right. If we tilt the phone to the left, we increment the offset a bit, and vice versa. When you are holding the phone, your hand is mostly steady, but the sensor also picks up the minor motion of your hand. That's the tilt value we're tracking.
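If that hand jitter makes the animation too twitchy, one common trick (a sketch, not part of the code above) is an exponential low-pass filter that blends each new reading with the previous smoothed value before feeding it to the tilt:

```javascript
// Exponential low-pass filter: each output is a blend of the
// previous output and the new reading. Smaller alpha = smoother
// (but laggier) response. A sketch for taming hand jitter.
function lowPass(prev, next, alpha = 0.2) {
  return prev + alpha * (next - prev);
}

// Feed in a jittery sequence of x readings; the smoothed value
// trails the raw readings instead of jumping with every sample.
let smoothX = 0;
for (const reading of [1.0, 1.2, 0.9, 1.1, 1.0]) {
  smoothX = lowPass(smoothX, reading);
}
console.log(smoothX.toFixed(2)); // 0.70
```

Inside the listener you would call something like `lowPass(previousX, gravity.x)` before applying the ×50 scaling.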
Animate
Once again, given the tilt value, we want to paint feedback based on it. We'll use the react-native-reanimated library; here's the documentation.
import Animated, {
  useSharedValue,
  useAnimatedStyle,
  withSpring,
} from "react-native-reanimated";

export function Gravity({ children }) {
  const tilt = useSharedValue(0);
  ...
  const animatedStyles = useAnimatedStyle(() => {
    return {
      transform: [
        { perspective: 100 },
        {
          rotateY: withSpring(`${tilt.value}deg`),
        },
      ],
    };
  });
  return <Animated.View style={[animatedStyles]}>{children}</Animated.View>;
}
We first define the tilt value via useSharedValue, a hook similar to useRef in that it keeps track of a value without being coupled to the React render cycle; that matters because the value changes on every sensor reading and drives the animation directly.
We then assemble a style object via the useAnimatedStyle hook. This time we use the perspective transform, which gives an ad-hoc 3D feel to the underlying 2D content. Here we set perspective to 100 and wire the rotateY property to our tracked tilt value. We also add a bit of physics via withSpring. If you're curious how perspective works, please read the documentation here.
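To build some intuition for that perspective value: under a standard perspective projection, a point at depth z (measured away from the viewer, with the screen plane at 0) is scaled by d / (d + z), where d is the perspective distance. A small d like our 100 exaggerates the 3D effect. A quick sketch of the math, with the sign convention as my assumption:

```javascript
// Perspective projection scale factor: d / (d + z), where d is the
// perspective distance and z is depth away from the viewer.
// Small d = strong 3D effect; huge d = nearly flat.
function perspectiveScale(d, z) {
  return d / (d + z);
}

console.log(perspectiveScale(100, 0));   // 1   (on the screen plane: unchanged)
console.log(perspectiveScale(100, 100)); // 0.5 (pushed back: half size)
console.log(perspectiveScale(100, -50)); // 2   (pulled forward: twice the size)
```

As the card rotates around Y, one edge gains positive depth and shrinks while the other gains negative depth and grows, which is what sells the 3D tilt.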
With the styles ready, the last step is to wire them into the Animated.View. Notice the children prop; that is where we apply the gravity effect to any item. In our case, we put a Pokemon picture under the gravity operation. Here is how to use the Gravity component we just built:
<GestureHandlerRootView style={{ flex: 1 }}>
  <View style={styles.view}>
    <View style={{ flex: 1 }}>
      <Gravity>
        <Pan>
          <Image
            style={styles.image}
            source={{ uri: getPokemonImageSrc(pokemon.id) }}
          />
        </Pan>
      </Gravity>
    </View>
  </View>
</GestureHandlerRootView>
Now let's take a look at what we got. Notice we even kept the Pan gesture that we built in the previous post.
Nice!
More sensors
As you might have guessed, phones are actually equipped with quite a few sensors, such as a gyroscope, a barometer, a light sensor, or even a pedometer. A gyroscope is built into the GY-521 as well; it tracks how you rotate the device in 3D. If I plug in that data instead of gravity, e.g. wiring the gyroscope's rotation around the y axis to the Pokemon's rotateY, I get a similar and slightly smoother effect.
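Plugging in the gyroscope works on the same principle, with one twist: a gyroscope reports angular velocity (how fast the device is rotating), not an angle, so you accumulate the rate over time. A minimal sketch, assuming readings in degrees per second sampled at the same 200 ms interval we configured earlier:

```javascript
// Integrate angular velocity into an angle: angle += rate * dt.
// Sketch assuming degrees/second readings every 200 ms; real code
// would also correct for drift (e.g. with the accelerometer).
function integrate(angle, rateDegPerSec, dtSeconds) {
  return angle + rateDegPerSec * dtSeconds;
}

// Rotating at a steady 90°/s for one second (5 samples of 200 ms)
// accumulates a 90° angle.
let angle = 0;
for (let i = 0; i < 5; i++) {
  angle = integrate(angle, 90, 0.2);
}
console.log(angle); // 90
```

The resulting angle can be wired straight into the rotateY transform, just like the tilt value was.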