Manual gestures
RNGH2 finally brings one of the most requested features: manual gestures and touch events. To demonstrate how manual gestures work, we will build a simple one that tracks all pointers on the screen. In order to do that, we need a way to keep information about every pointer that is currently placed on the screen:
interface Pointer {
  visible: boolean;
  x: number;
  y: number;
}
We will also need a component to mark where a pointer is. It will be visible only while the pointer is down, and it will scale up and change color when the gesture activates:

import { StyleSheet } from 'react-native';
import Animated, {
  useAnimatedStyle,
  useSharedValue,
} from 'react-native-reanimated';
function PointerElement(props: {
  pointer: Animated.SharedValue<Pointer>,
  active: Animated.SharedValue<boolean>,
}) {
  const animatedStyle = useAnimatedStyle(() => ({
    transform: [
      { translateX: props.pointer.value.x },
      { translateY: props.pointer.value.y },
      {
        scale:
          (props.pointer.value.visible ? 1 : 0) *
          (props.active.value ? 1.3 : 1),
      },
    ],
    backgroundColor: props.active.value ? 'red' : 'blue',
  }));

  return <Animated.View style={[styles.pointer, animatedStyle]} />;
}
// ...
const styles = StyleSheet.create({
  pointer: {
    width: 60,
    height: 60,
    borderRadius: 30,
    backgroundColor: 'red',
    position: 'absolute',
    marginStart: -30,
    marginTop: -30,
  },
});
Now that we can represent pointers on the screen, let's create the component that hosts the gesture. It keeps a shared value for every pointer we want to track, along with a flag indicating whether the gesture is active:

import { Gesture, GestureDetector } from 'react-native-gesture-handler';
export default function Example() {
  const trackedPointers: Animated.SharedValue<Pointer>[] = [];
  const active = useSharedValue(false);

  for (let i = 0; i < 12; i++) {
    trackedPointers[i] = useSharedValue<Pointer>({
      visible: false,
      x: 0,
      y: 0,
    });
  }

  const gesture = Gesture.Manual();

  return (
    <GestureDetector gesture={gesture}>
      <Animated.View style={{ flex: 1 }}>
        {trackedPointers.map((pointer, index) => (
          <PointerElement pointer={pointer} active={active} key={index} />
        ))}
      </Animated.View>
    </GestureDetector>
  );
}
First, let's handle onTouchesDown: make the pointers that were just placed visible and activate the gesture once at least two fingers are on the screen:

const gesture = Gesture.Manual().onTouchesDown((e, manager) => {
  for (const touch of e.changedTouches) {
    trackedPointers[touch.id].value = {
      visible: true,
      x: touch.x,
      y: touch.y,
    };
  }

  if (e.numberOfTouches >= 2) {
    manager.activate();
  }
});
Next, onTouchesMove updates the positions so the pointers follow the fingers:

const gesture = Gesture.Manual()
  ...
  .onTouchesMove((e, _manager) => {
    for (const touch of e.changedTouches) {
      trackedPointers[touch.id].value = {
        visible: true,
        x: touch.x,
        y: touch.y,
      };
    }
  })
When fingers are lifted, onTouchesUp hides the corresponding pointers and ends the gesture once no touches remain:

const gesture = Gesture.Manual()
  ...
  .onTouchesUp((e, manager) => {
    for (const touch of e.changedTouches) {
      trackedPointers[touch.id].value = {
        visible: false,
        x: touch.x,
        y: touch.y,
      };
    }

    if (e.numberOfTouches === 0) {
      manager.end();
    }
  })
Finally, onStart and onEnd toggle the active flag so the pointers change their appearance when the gesture activates:

const gesture = Gesture.Manual()
  ...
  .onStart(() => {
    active.value = true;
  })
  .onEnd(() => {
    active.value = false;
  });
And that's all! As you can see, using manual gestures is really easy, but as you can imagine, they are also a powerful tool that makes it possible to accomplish things that were previously impossible with RNGH.
Modifying existing gestures
While manual gestures open up great possibilities, reimplementing pinch or rotation from scratch just because you need to activate them under specific circumstances, or because you need the positions of the fingers, would be a waste of time, as those gestures are already available. Therefore, you can use touch events with every gesture to extract more detailed information about it than the basic events alone provide. We also added a manualActivation modifier on all continuous gestures, which prevents the gesture it is applied to from activating automatically, giving you full control over its behavior.
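For example, here is a minimal sketch (not part of the original example) of attaching touch events to a regular Pinch gesture in order to read finger positions while the built-in pinch behavior keeps working; the logging is only illustrative:

// Sketch only: touch events are available on every gesture, not just Gesture.Manual().
const pinch = Gesture.Pinch()
  .onTouchesMove((e) => {
    // read the position of every finger currently on the screen
    for (const touch of e.allTouches) {
      console.log(`finger ${touch.id} at (${touch.x}, ${touch.y})`);
    }
  })
  .onUpdate((e) => {
    // the standard pinch behavior is unchanged
    console.log('scale', e.scale);
  });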
This functionality makes another highly requested feature possible: drag after long press. Simply set manualActivation to true on a PanGesture and use the StateManager to fail the gesture if the user attempts to drag the component sooner than the duration of the long press.
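A minimal sketch of that idea, assuming an illustrative 500 ms threshold and a touchStart shared value (created inside the component with useSharedValue(0)) that holds the time of the initial touch; the onUpdate handler stands in for your regular drag logic:

const LONG_PRESS_DURATION_MS = 500; // illustrative threshold, not an RNGH constant

const dragGesture = Gesture.Pan()
  .manualActivation(true)
  .onTouchesDown(() => {
    // remember when the finger was placed; touchStart is an assumed shared value
    touchStart.value = Date.now();
  })
  .onTouchesMove((_e, manager) => {
    if (Date.now() - touchStart.value > LONG_PRESS_DURATION_MS) {
      // the long press has elapsed - let the drag activate
      manager.activate();
    } else {
      // the user started dragging too early - fail the gesture
      manager.fail();
    }
  })
  .onUpdate((e) => {
    // regular pan handling goes here, e.g. moving the dragged component
    console.log('translation', e.translationX, e.translationY);
  });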