Use when implementing haptic feedback, Core Haptics patterns, audio-haptic synchronization, or debugging haptic issues - covers UIFeedbackGenerator, CHHapticEngine, AHAP patterns, and Apple's Causality-Harmony-Utility design principles from WWDC 2021
This skill inherits all available tools. When active, it can use any tool Claude has access to.
Comprehensive guide to implementing haptic feedback on iOS. Excellent haptic feedback is a hallmark of Apple Design Award winners, and Apple's own Camera, Maps, and Weather apps all use haptics masterfully to create delightful, responsive experiences.
Haptic feedback provides tactile confirmation of user actions and system events. When designed thoughtfully using the Causality-Harmony-Utility framework, haptics transform interfaces from functional to delightful.
This skill covers both simple haptics (UIFeedbackGenerator) and advanced custom patterns (Core Haptics), with real-world examples and audio-haptic synchronization techniques.
Apple's audio and haptic design teams established three core principles for multimodal feedback:
Problem: User can't tell what triggered the haptic
Solution: Haptic timing must match the visual/interaction moment
Example from WWDC:
Code pattern:
// ✅ Immediate feedback on touch
@objc func buttonTapped() {
    let generator = UIImpactFeedbackGenerator(style: .medium)
    generator.impactOccurred() // Fire immediately
    performAction()
}

// ❌ Delayed feedback loses causality
@objc func buttonTapped() {
    performAction()
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.2) {
        let generator = UIImpactFeedbackGenerator(style: .medium)
        generator.impactOccurred() // Too late!
    }
}
Problem: Visual, audio, and haptic don't match
Solution: All three senses should feel like a unified experience
Example from WWDC:
Key insight: A large object should feel heavy, sound low and resonant, and look substantial. All three senses reinforce the same experience.
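As an illustrative sketch of this principle, the same physical quantity can drive the haptic parameters that touch, sound, and visuals share. Here `objectScale` is a hypothetical 0.0-1.0 size value from the app's own model, not an Apple API, and the mapping coefficients are arbitrary tuning values:

```swift
import CoreHaptics

// Sketch only: map a hypothetical object size (0.0-1.0) to haptic
// parameters so touch, sound, and visuals reinforce each other.
func hapticParameters(forObjectScale objectScale: Float) -> [CHHapticEventParameter] {
    // Bigger objects feel stronger...
    let intensity = CHHapticEventParameter(
        parameterID: .hapticIntensity,
        value: 0.4 + 0.6 * objectScale
    )
    // ...and duller (low sharpness reads as a heavy thud)
    let sharpness = CHHapticEventParameter(
        parameterID: .hapticSharpness,
        value: 1.0 - 0.7 * objectScale
    )
    return [intensity, sharpness]
}
```

The same `objectScale` would also drive the animation scale and the audio pitch, keeping all three senses in agreement.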
Problem: Haptics used everywhere "just because we can"
Solution: Reserve haptics for significant moments that benefit the user
When to use haptics: confirming meaningful actions (task completed, item snapped into place), signaling success or failure, and reinforcing interactions that mimic physical mechanisms (detents, collisions, pulls).

When NOT to use haptics: high-frequency events such as every scroll tick, purely decorative flourishes, or any case where feedback fires so often that it loses meaning.
For most apps, UIFeedbackGenerator provides 3 simple haptic types without custom patterns.
Physical collision or impact sensation.
Styles:

- .light - Small, delicate tap
- .medium - Standard tap (most common)
- .heavy - Strong, solid impact
- .rigid - Firm, precise tap
- .soft - Gentle, cushioned tap

Usage pattern:
class MyViewController: UIViewController {
    let impactGenerator = UIImpactFeedbackGenerator(style: .medium)

    override func viewDidLoad() {
        super.viewDidLoad()
        // Prepare reduces latency for next impact
        impactGenerator.prepare()
    }

    @objc func userDidTap() {
        impactGenerator.impactOccurred()
    }
}
Intensity variation (iOS 13+):
// intensity: 0.0 (lightest) to 1.0 (strongest)
impactGenerator.impactOccurred(intensity: 0.5)
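One common use for variable intensity is scaling feedback with gesture speed. A sketch, assuming a pan gesture whose velocity is normalized against a chosen maximum (2000 pt/s here is an arbitrary tuning value):

```swift
import UIKit

// Sketch: stronger impact for faster drags
func playDragHaptic(for recognizer: UIPanGestureRecognizer,
                    in view: UIView,
                    using generator: UIImpactFeedbackGenerator) {
    let speed = abs(recognizer.velocity(in: view).y)
    // Normalize to 0.0-1.0; 2000 pt/s and above counts as maximum
    let intensity = min(speed / 2000, 1.0)
    generator.impactOccurred(intensity: intensity)
}
```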
Common use cases: standard button taps (.medium), subtle UI state changes (.light), strong collisions or drops (.heavy), and precise snaps (.rigid).

Discrete selection changes (picker wheels, segmented controls).
Usage:
class PickerViewController: UIViewController {
    let selectionGenerator = UISelectionFeedbackGenerator()

    func pickerView(_ picker: UIPickerView, didSelectRow row: Int,
                    inComponent component: Int) {
        selectionGenerator.selectionChanged()
    }
}
Feels like: Clicking a physical wheel with detents
Common use cases: picker wheels, segmented controls, stepping through values, and snapping during reordering.
System-level success/warning/error feedback.
Types:

- .success - Task completed successfully
- .warning - Attention needed, but not critical
- .error - Critical error occurred

Usage:
let notificationGenerator = UINotificationFeedbackGenerator()

func submitForm() {
    // Validate form
    if isValid {
        notificationGenerator.notificationOccurred(.success)
        saveData()
    } else {
        notificationGenerator.notificationOccurred(.error)
        showValidationErrors()
    }
}
Best practice: Match the haptic type to the user outcome: .success for completed tasks, .error for failures, and .warning when attention is needed.

Call prepare() before the haptic to reduce latency:
// ✅ Good - prepare before user action
@IBAction func buttonTouchDown(_ sender: UIButton) {
    impactGenerator.prepare() // User's finger is down
}

@IBAction func buttonTouchUpInside(_ sender: UIButton) {
    impactGenerator.impactOccurred() // Immediate haptic
}

// ❌ Bad - unprepared haptic may lag
@IBAction func buttonTapped(_ sender: UIButton) {
    let generator = UIImpactFeedbackGenerator()
    generator.impactOccurred() // May have 10-20ms delay
}
Prepare timing: System keeps engine ready for ~1 second after prepare().
For apps needing custom patterns, Core Haptics provides full control over haptic waveforms.
Core Haptics is built from four layers:

- Engine (CHHapticEngine) - Link to the phone's actuator
- Player (CHHapticPatternPlayer) - Playback control
- Pattern (CHHapticPattern) - Collection of events over time
- Event (CHHapticEvent) - Building blocks specifying the experience

import CoreHaptics
class HapticManager {
    var engine: CHHapticEngine?

    func initializeHaptics() {
        // Check device support
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
            print("Device doesn't support haptics")
            return
        }

        do {
            // Create engine
            engine = try CHHapticEngine()

            // Handle interruptions (calls, Siri, etc.)
            // Capture self weakly to avoid a retain cycle through the engine
            engine?.stoppedHandler = { [weak self] reason in
                print("Engine stopped: \(reason)")
                self?.restartEngine()
            }

            // Handle reset (audio session changes)
            engine?.resetHandler = { [weak self] in
                print("Engine reset")
                self?.restartEngine()
            }

            // Start engine
            try engine?.start()
        } catch {
            print("Failed to create haptic engine: \(error)")
        }
    }

    func restartEngine() {
        do {
            try engine?.start()
        } catch {
            print("Failed to restart engine: \(error)")
        }
    }
}
Critical: Always set stoppedHandler and resetHandler to handle system interruptions.
Short, discrete feedback (like a tap).
let intensity = CHHapticEventParameter(
    parameterID: .hapticIntensity,
    value: 1.0 // 0.0 to 1.0
)
let sharpness = CHHapticEventParameter(
    parameterID: .hapticSharpness,
    value: 0.5 // 0.0 (dull) to 1.0 (sharp)
)
let event = CHHapticEvent(
    eventType: .hapticTransient,
    parameters: [intensity, sharpness],
    relativeTime: 0.0 // Seconds from pattern start
)
Parameters:

- hapticIntensity: Strength (0.0 = barely felt, 1.0 = maximum)
- hapticSharpness: Character (0.0 = dull thud, 1.0 = crisp snap)

Sustained feedback over time (like a vibration motor).
let intensity = CHHapticEventParameter(
    parameterID: .hapticIntensity,
    value: 0.8
)
let sharpness = CHHapticEventParameter(
    parameterID: .hapticSharpness,
    value: 0.3
)
let event = CHHapticEvent(
    eventType: .hapticContinuous,
    parameters: [intensity, sharpness],
    relativeTime: 0.0,
    duration: 2.0 // Seconds
)
Use cases:
func playCustomPattern() {
    // Create events
    let tap1 = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        ],
        relativeTime: 0.0
    )
    let tap2 = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.7),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.7)
        ],
        relativeTime: 0.3
    )
    let tap3 = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0)
        ],
        relativeTime: 0.6
    )

    do {
        // Create pattern from events
        let pattern = try CHHapticPattern(
            events: [tap1, tap2, tap3],
            parameters: []
        )
        // Create player
        let player = try engine?.makePlayer(with: pattern)
        // Play
        try player?.start(atTime: CHHapticTimeImmediate)
    } catch {
        print("Failed to play pattern: \(error)")
    }
}
For continuous feedback (rolling textures, motors), use advanced player:
func startRollingTexture() {
    let event = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        ],
        relativeTime: 0.0,
        duration: 0.5
    )

    do {
        let pattern = try CHHapticPattern(events: [event], parameters: [])

        // Use advanced player for looping
        let player = try engine?.makeAdvancedPlayer(with: pattern)

        // Enable looping (a plain property assignment, no try needed)
        player?.loopEnabled = true

        // Start
        try player?.start(atTime: CHHapticTimeImmediate)

        // Update intensity dynamically based on ball speed
        updateTextureIntensity(player: player)
    } catch {
        print("Failed to start texture: \(error)")
    }
}

func updateTextureIntensity(player: CHHapticAdvancedPatternPlayer?) {
    let newIntensity = calculateIntensityFromBallSpeed()
    let intensityParam = CHHapticDynamicParameter(
        parameterID: .hapticIntensityControl,
        value: newIntensity,
        relativeTime: 0
    )
    try? player?.sendParameters([intensityParam], atTime: CHHapticTimeImmediate)
}
Key difference: CHHapticPatternPlayer plays once, CHHapticAdvancedPatternPlayer supports looping and dynamic parameter updates.
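The advanced player's extra capabilities can be sketched in one place. This assumes an already-started engine and an existing pattern; pause/resume and the completion handler are only available on CHHapticAdvancedPatternPlayer:

```swift
import CoreHaptics

// Sketch: lifecycle control that the basic player does not offer
func demoAdvancedControl(engine: CHHapticEngine,
                         pattern: CHHapticPattern) throws {
    let player = try engine.makeAdvancedPlayer(with: pattern)
    player.loopEnabled = true
    player.completionHandler = { error in
        // Called when playback finishes (error is nil on success)
        print("Playback finished: \(String(describing: error))")
    }
    try player.start(atTime: CHHapticTimeImmediate)

    // Later: pause and resume without losing position, or stop entirely
    try player.pause(atTime: CHHapticTimeImmediate)
    try player.resume(atTime: CHHapticTimeImmediate)
    try player.stop(atTime: CHHapticTimeImmediate)
}
```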
AHAP (Apple Haptic Audio Pattern) files are JSON files combining haptic events and audio.
{
  "Version": 1.0,
  "Metadata": {
    "Project": "My App",
    "Created": "2024-01-15"
  },
  "Pattern": [
    {
      "Event": {
        "Time": 0.0,
        "EventType": "HapticTransient",
        "EventParameters": [
          {
            "ParameterID": "HapticIntensity",
            "ParameterValue": 1.0
          },
          {
            "ParameterID": "HapticSharpness",
            "ParameterValue": 0.5
          }
        ]
      }
    }
  ]
}
{
  "Version": 1.0,
  "Pattern": [
    {
      "Event": {
        "Time": 0.0,
        "EventType": "AudioCustom",
        "EventParameters": [
          {
            "ParameterID": "AudioVolume",
            "ParameterValue": 0.8
          }
        ],
        "EventWaveformPath": "ShieldA.wav"
      }
    },
    {
      "Event": {
        "Time": 0.0,
        "EventType": "HapticContinuous",
        "EventDuration": 0.5,
        "EventParameters": [
          {
            "ParameterID": "HapticIntensity",
            "ParameterValue": 0.6
          }
        ]
      }
    }
  ]
}
func loadAHAPPattern(named name: String) -> CHHapticPattern? {
    guard let url = Bundle.main.url(forResource: name, withExtension: "ahap") else {
        print("AHAP file not found")
        return nil
    }
    do {
        return try CHHapticPattern(contentsOf: url)
    } catch {
        print("Failed to load AHAP: \(error)")
        return nil
    }
}

// Usage
if let pattern = loadAHAPPattern(named: "ShieldTransient") {
    let player = try? engine?.makePlayer(with: pattern)
    try? player?.start(atTime: CHHapticTimeImmediate)
}
Example iteration: Shield initially used 3 transient pulses (haptic) + progressive continuous sound (audio) → no harmony. Solution: Switch to continuous haptic + ShieldA.wav audio → unified experience.
class ViewController: UIViewController {
    let animationDuration: TimeInterval = 0.5

    func performShieldTransformation() {
        // Start haptic/audio simultaneously with animation
        playShieldPattern()

        UIView.animate(withDuration: animationDuration) {
            self.shieldView.transform = CGAffineTransform(scaleX: 1.2, y: 1.2)
            self.shieldView.alpha = 0.8
        }
    }

    func playShieldPattern() {
        if let pattern = loadAHAPPattern(named: "ShieldContinuous") {
            let player = try? engine?.makePlayer(with: pattern)
            try? player?.start(atTime: CHHapticTimeImmediate)
        }
    }
}
Critical: Fire haptic at the exact moment the visual change occurs, not before or after.
import AVFoundation

class AudioHapticCoordinator {
    let audioPlayer: AVAudioPlayer
    let hapticEngine: CHHapticEngine

    func playCoordinatedExperience() {
        // Stop the engine automatically once all players finish
        hapticEngine.notifyWhenPlayersFinished { _ in
            return .stopEngine
        }

        // Schedule both slightly in the future so they start together
        let delay: TimeInterval = 0.05

        // AVAudioPlayer schedules against its own deviceCurrentTime
        audioPlayer.play(atTime: audioPlayer.deviceCurrentTime + delay)

        // Haptic players schedule against the engine's timebase
        if let pattern = loadAHAPPattern(named: "CoordinatedPattern") {
            let player = try? hapticEngine.makePlayer(with: pattern)
            try? player?.start(atTime: hapticEngine.currentTime + delay)
        }
    }
}
class HapticButton: UIButton {
    let impactGenerator = UIImpactFeedbackGenerator(style: .medium)

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        impactGenerator.prepare()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        impactGenerator.impactOccurred()
    }
}
class HapticSlider: UISlider {
    let selectionGenerator = UISelectionFeedbackGenerator()
    var lastValue: Float = 0

    @objc func valueChanged() {
        let threshold: Float = 0.1
        if abs(value - lastValue) >= threshold {
            selectionGenerator.selectionChanged()
            lastValue = value
        }
    }
}
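For valueChanged to fire, it must be registered for the control event; a minimal wiring sketch, with the slider acting as its own target:

```swift
let slider = HapticSlider()
slider.addTarget(slider,
                 action: #selector(HapticSlider.valueChanged),
                 for: .valueChanged)
```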
class PullToRefreshController: UIViewController {
    let impactGenerator = UIImpactFeedbackGenerator(style: .medium)
    var isRefreshing = false

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        let threshold: CGFloat = -100
        let offset = scrollView.contentOffset.y

        if offset <= threshold && !isRefreshing {
            impactGenerator.impactOccurred()
            isRefreshing = true
            beginRefresh()
        }
    }
}
func handleServerResponse(_ result: Result<Data, Error>) {
    let notificationGenerator = UINotificationFeedbackGenerator()

    switch result {
    case .success:
        notificationGenerator.notificationOccurred(.success)
        showSuccessMessage()
    case .failure:
        notificationGenerator.notificationOccurred(.error)
        showErrorAlert()
    }
}
Haptics DO NOT work in Simulator: the code compiles and runs without errors, but no haptic is ever produced.
Solution: Always test on physical device (iPhone 8 or newer).
func playHaptic() {
    #if DEBUG
    print("🔔 Playing haptic - Engine running: \(engine?.currentTime ?? -1)")
    #endif

    do {
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
        #if DEBUG
        print("✅ Haptic started successfully")
        #endif
    } catch {
        #if DEBUG
        print("❌ Haptic failed: \(error.localizedDescription)")
        #endif
    }
}
Symptom: CHHapticEngine.start() throws error
Causes: the device lacks haptic hardware, the app is running in the background, or the audio session was interrupted.
Solution:
func safelyStartEngine() {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
        print("Device doesn't support haptics")
        return
    }

    do {
        try engine?.start()
    } catch {
        print("Engine start failed: \(error)")
        // Fall back to UIFeedbackGenerator
        useFallbackHaptics()
    }
}
Symptom: Code runs but no haptic felt on device
Debug steps: confirm the engine started without throwing, check that event intensity values are above 0.0, and verify System Haptics is enabled in Settings > Sounds & Haptics.
Symptom: Audio plays but haptic delayed or vice versa
Causes: missing prepare() before the haptic, or the audio and haptic were started at different times.

Solution:
// ✅ Synchronized start
func playCoordinated() {
    impactGenerator.prepare() // Reduce latency

    // Start both simultaneously
    audioPlayer.play()
    impactGenerator.impactOccurred()
}
Symptom: AHAP pattern fails to load or play
Cause: Audio file > 4.2 MB or > 23 seconds
Solution: Keep audio files small and short. Use compressed formats (AAC) and trim to essential duration.
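A pre-flight size check can catch oversized assets before they fail at runtime. This sketch assumes the 4.2 MB limit stated above; the function name is hypothetical:

```swift
import Foundation

// Sketch: reject AHAP audio assets above the stated size limit
func audioAssetFitsAHAPLimit(at url: URL) -> Bool {
    let maxBytes = 4_200_000
    guard let attributes = try? FileManager.default.attributesOfItem(atPath: url.path),
          let size = attributes[.size] as? Int else {
        return false // Unreadable file: treat as unusable
    }
    return size <= maxBytes
}
```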