iOS/Sources/App/WebView/WebRTC/WebRTCVideoPlayerViewControls.swift
Copilot cd86bc3ade
Add audio support to WebRTC video player with mute/unmute controls (#4074)
## Summary

Enables audio playback in the native WebRTC video player and adds
mute/unmute controls. The previous implementation had audio disabled via
`useManualAudio = true` to avoid microphone permission requests.

**Key changes:**

- **Audio session configuration**: Uses
`AVAudioSession.Category.playback` for receive-only audio (no microphone
access required)
- **Remote audio track management**: Properly extracts audio track from
peer connection transceivers after remote SDP is set
- **Mute/unmute UI**: Added a button to the player's top control row using SF
Symbols (`speaker.wave.3` / `speaker.slash.fill`)
- **State synchronization**: ViewModel mute state always reflects the actual
audio track state (see the sketch after the technical details below)

**Technical details:**

```swift
// Audio session configured for playback only
audioSession.setCategory(AVAudioSession.Category.playback.rawValue)
audioSession.setMode(AVAudioSession.Mode.spokenAudio.rawValue)

// Remote audio track extracted after SDP negotiation
remoteAudioTrack = peerConnection.transceivers
    .first(where: { $0.mediaType == .audio })?
    .receiver.track as? RTCAudioTrack
```
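
The mute toggle simply enables or disables the extracted remote track and mirrors that state in the view model. A minimal sketch, assuming a view model that owns `remoteAudioTrack` and exposes the same `isMuted` / `toggleMute` pair the controls view consumes (anything beyond those names is illustrative, not the exact implementation):

```swift
// Sketch only: assumes the view model owns the peer connection and the
// `remoteAudioTrack` extracted above.
@Published private(set) var isMuted = false

func toggleMute() {
    guard let track = remoteAudioTrack else { return }
    // RTCMediaStreamTrack.isEnabled controls playback; disabling the track
    // silences the remote audio without renegotiating the connection.
    track.isEnabled.toggle()
    // Keep the published state in sync with the actual track state.
    isMuted = !track.isEnabled
}
```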

The change also removes the microphone-related warnings from the
experimental disclaimer, since playback-only audio never requests
microphone access.

## Screenshots

N/A - audio-only feature; the only visual change is the new mute button,
which follows the existing control patterns.

## Link to pull request in Documentation repository

Documentation: home-assistant/companion.home-assistant#

## Any other notes

The audio session mode is `.spokenAudio`, which suits the voice-centric
audio typical of camera streams. The mute button follows the same
auto-hide behavior as the other player controls (3-second timeout); a
rough sketch of that behavior follows.
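
For reference, auto-hide in a SwiftUI player view typically looks something like the sketch below; the `controlsVisible` flag and `scheduleAutoHide()` helper are hypothetical names, not the repository's actual implementation.

```swift
// Hypothetical sketch of a 3-second auto-hide for the controls overlay,
// as properties/methods on the player view.
@State private var controlsVisible = true
@State private var hideTask: Task<Void, Never>?

private func scheduleAutoHide() {
    // Restart the countdown whenever the user interacts with the player.
    hideTask?.cancel()
    hideTask = Task {
        try? await Task.sleep(nanoseconds: 3_000_000_000)
        guard !Task.isCancelled else { return }
        withAnimation { controlsVisible = false }
    }
}
```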

<details>

<summary>Original prompt</summary>

> Let's start supporting audio on WebRTCVideoPlayerView, including
control mute/unmute


</details>




---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: bgoncal <5808343+bgoncal@users.noreply.github.com>
2025-12-10 16:34:30 +00:00

import SFSafeSymbols
import Shared
import SwiftUI

struct WebRTCVideoPlayerViewControls: View {
    let close: () -> Void
    let isMuted: Bool
    let toggleMute: () -> Void

    var body: some View {
        ZStack {
            VStack {
                HStack {
                    Spacer()
                    topButtons
                }
                Spacer()
                Text(L10n.WebRTCPlayer.Experimental.disclaimer)
                    .font(DesignSystem.Font.footnote.weight(.light))
                    .foregroundStyle(.white)
                    .frame(maxWidth: .infinity, alignment: .leading)
                    .multilineTextAlignment(.leading)
            }
            .padding()
        }
        .background(
            LinearGradient(
                colors: [
                    .black, .clear, .black,
                ], startPoint: .top, endPoint: .bottom
            )
            .opacity(0.5)
        )
    }

    @ViewBuilder
    private var topButtons: some View {
        Button(action: toggleMute) {
            Image(systemSymbol: isMuted ? .speakerSlashFill : .speakerWave3)
                .resizable()
                .frame(width: 16, height: 16)
                .foregroundStyle(.white)
                .padding(DesignSystem.Spaces.oneAndHalf)
                .modify { view in
                    if #available(iOS 26.0, *) {
                        view.glassEffect(.clear.interactive(), in: .circle)
                    } else {
                        view
                            .background(Color.black.opacity(0.5))
                            .clipShape(Circle())
                    }
                }
        }
        .buttonStyle(.plain)
        ModalCloseButton(tint: .white) {
            close()
        }
        .padding(16)
    }
}

#Preview {
    WebRTCVideoPlayerViewControls(
        close: {},
        isMuted: false,
        toggleMute: {}
    )
}