diff --git a/plugins/radar-android-phone-audio-input/README.md b/plugins/radar-android-phone-audio-input/README.md
new file mode 100644
index 000000000..9a6ec7003
--- /dev/null
+++ b/plugins/radar-android-phone-audio-input/README.md
@@ -0,0 +1,98 @@
+# RADAR PHONE AUDIO INPUT
+
+Plugin for recording uncompressed, high-quality audio. It uses low-level audio classes to interact directly with the hardware, and supports playback of recordings as well as selection of the audio input device.
+
+
+## Installation
+
+Include this plugin in a RADAR app by adding the following configuration to `build.gradle`:
+```gradle
+dependencies {
+ implementation "org.radarbase:radar-android-phone-audio-input:$radarCommonsAndroidVersion"
+}
+```
+Add the provider `org.radarbase.passive.phone.audio.input.PhoneAudioInputProvider` to the `plugins` variable of the `RadarService` instance in your app.
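+
+A minimal sketch of registering the provider in the app's `RadarService` subclass is shown below (the service class name is illustrative, and the exact override may differ between apps):
+
+```kotlin
+class MyAppRadarService : RadarService() {
+    // Expose this plugin's provider together with the app's other source providers.
+    override val plugins: List<SourceProvider<*>>
+        get() = listOf(
+            PhoneAudioInputProvider(this),
+            // ...other providers used by the app
+        )
+}
+```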
+
+## Configuration
+
+To enable this plugin, add the provider `phone_audio_input` to the `plugins` property of the configuration.
+
+This plugin takes the following Firebase configuration parameters:
+
+
+| Name | Type | Default | Description |
+|------------------------------------------|------|--------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| `phone-audio-input-audio-source` | int | `1` = MediaRecorder.AudioSource.MIC | Specifies the source of audio input for recording. The default value is 1, which corresponds to MediaRecorder.AudioSource.MIC. This means the application will use the device's microphone for audio input. |
+| `phone-audio-input-recorder-buffer-size` | int | `-1` | Defines the size of the buffer used by the audio recorder. The buffer is a temporary storage area for audio data before it is processed or saved. Setting this to -1 lets the system choose an optimal buffer size based on the current audio configuration. Adjusting this can impact audio latency and quality. |
+| `phone-audio-input-current-audio-format` | int | `2` = AudioFormat.ENCODING_PCM_16BIT | Determines the encoding format of the recorded audio. The default value 2 corresponds to AudioFormat.ENCODING_PCM_16BIT, meaning audio is recorded as 16-bit Pulse Code Modulation (PCM). PCM is a common uncompressed audio format that provides high-quality sound. Other formats such as 8-bit PCM can also be used, but they may reduce audio quality. Note that 8-bit encoding might not work on all devices, so 16-bit should be preferred. |
+| `phone-audio-input-current-channel` | int | `16` = AudioFormat.CHANNEL_IN_MONO | Specifies the channel configuration used during recording. The default value 16 (0x10) corresponds to AudioFormat.CHANNEL_IN_MONO, meaning audio is recorded from a single channel. Mono recording is typical for voice recording, saving space and simplifying processing. Stereo (AudioFormat.CHANNEL_IN_STEREO) can also be used to capture a sense of direction. |
+| `phone-audio-input-current-sample-rate` | int | `16000` | Defines the number of audio samples captured per second. The default value is 16000 Hz (16 kHz), which is a common sample rate for voice recording, balancing audio quality and file size. Higher sample rates, such as 44100 Hz (CD quality), provide better audio fidelity but result in larger file sizes. Lower sample rates may be used for lower quality or reduced file size needs. |
+
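+The integer values map directly to constants in the Android SDK. The following illustrative Kotlin snippet lists the constants behind the defaults above, plus common alternatives:
+
+```kotlin
+import android.media.AudioFormat
+import android.media.MediaRecorder
+
+// Integer values behind the defaults above, plus common alternatives.
+val source = MediaRecorder.AudioSource.MIC      // 1 (default audio source)
+val pcm16 = AudioFormat.ENCODING_PCM_16BIT      // 2 (default audio format)
+val pcm8 = AudioFormat.ENCODING_PCM_8BIT        // 3 (may not work on all devices)
+val mono = AudioFormat.CHANNEL_IN_MONO          // 16 = 0x10 (default channel configuration)
+val stereo = AudioFormat.CHANNEL_IN_STEREO      // 12 = 0xC
+```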
+
+
+This plugin produces data for the following topics (the type is in the `org.radarcns.passive.phone` package):
+
+| Topic | Type | Description |
+|-----------------------------|-------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| `android_phone_audio_input` | `PhoneAudioInput` | This topic captures high-quality, uncompressed audio data recorded by the user's device. The PhoneAudioInput type includes detailed information about the audio recording, such as timestamps, file metadata, and device specifications. |
+
+## Workflow Description
+
+This plugin provides an interactive UI for recording and managing audio data. Below is a detailed description of how the plugin works:
+
+### Workflow
+
+1. **Plugin Action**:
+ - The plugin has an action that, when clicked, opens a user interface (UI) for interacting with the plugin.
+
+2. **Activity Launch**:
+ - The action opens the `PhoneAudioInputActivity`, which serves as the main interface for the plugin.
+
+3. **Manager Connection**:
+ - `PhoneAudioInputActivity` is connected to the `PhoneAudioInputManager` through interfaces defined in `PhoneAudioInputState`. This connection manages the state and interactions between the UI and the underlying audio recording functionalities.
+
+4. **Recording Capabilities**:
+ - The activity provides capabilities for recording audio, allowing users to start, stop, and manage audio recordings directly from the UI.
+
+5. **Device Selection Mechanism**:
+   - If the user has not explicitly selected a microphone, the plugin picks an input device automatically, preferring external devices in the following order of precedence:
+     - `TYPE_USB_DEVICE` / `TYPE_USB_HEADSET`
+     - `TYPE_WIRED_HEADSET`
+     - `TYPE_BLUETOOTH_A2DP` / `TYPE_BLUETOOTH_SCO`
+   - If no external device is connected, the plugin defaults to the smartphone's built-in microphone. A simplified sketch of this fallback is shown after this list.
+
+6. **Fragment Integration**:
+ - `PhoneAudioInputActivity` opens a fragment that provides additional capabilities, including:
+ - **Audio Playback**: Users can playback the recorded audio to review it before sending.
+     - **Data Sending**: Users can send the recorded audio data for storage (upload to S3 is planned but not yet enabled).
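+
+A simplified sketch of the device-selection fallback described in step 5 (illustrative only; the plugin's actual implementation is in `PhoneAudioInputManager`):
+
+```kotlin
+import android.media.AudioDeviceInfo
+
+fun selectInputDevice(connected: List<AudioDeviceInfo>): AudioDeviceInfo? {
+    val priority = listOf(
+        intArrayOf(AudioDeviceInfo.TYPE_USB_DEVICE, AudioDeviceInfo.TYPE_USB_HEADSET),
+        intArrayOf(AudioDeviceInfo.TYPE_WIRED_HEADSET),
+        intArrayOf(AudioDeviceInfo.TYPE_BLUETOOTH_A2DP, AudioDeviceInfo.TYPE_BLUETOOTH_SCO),
+    )
+    // Take the first connected device matching the highest-priority type; otherwise fall back
+    // to whatever the system reports first (typically the built-in microphone).
+    return priority.firstNotNullOfOrNull { types -> connected.firstOrNull { it.type in types } }
+        ?: connected.firstOrNull()
+}
+```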
+
+### Guide for Using the PhoneAudioInput Plugin
+
+
+
+
+1. In the main activity, where plugins are visible along with their actions (if any), click on the specific action for the Phone Audio Input plugin.
+ - This will redirect you to the first activity of the Phone Audio Input plugin.
+
+
+
+
+2. At the top of this activity, there is a dropdown menu for selecting microphones.
+   - When collapsed, the dropdown shows the microphone that audio input is currently routed through.
+   - Click the refresh button next to the dropdown menu to refresh the list of available microphones.
+   - Select your preferred microphone from the dropdown menu. If no preferred microphone is selected, the automatic device selection logic described above is used.
+
+
+
+
+3. Click Start Recording to begin recording audio.
+ - While recording, you can pause or resume the recording.
+ - Do not quit the recording activity while recording is in progress or paused, as this may cause the application to misbehave. An alert will also be shown if you attempt to quit the activity in a recording or paused state. You can safely quit the activity after stopping the recording.
+
+
+
+
+4. After stopping the recording, an alert dialog will appear with three options: play the recorded audio, send it directly without playing, or discard it.
+ - If you click Play, you will be redirected to a new fragment for listening to the last recorded audio. Here, you can review the recording and send it after listening.
+
+
diff --git a/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/action_devices.jpg b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/action_devices.jpg
new file mode 100644
index 000000000..889fc78c6
Binary files /dev/null and b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/action_devices.jpg differ
diff --git a/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/alert_rec_ongoing.jpg b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/alert_rec_ongoing.jpg
new file mode 100644
index 000000000..37bbdc60b
Binary files /dev/null and b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/alert_rec_ongoing.jpg differ
diff --git a/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/alert_rec_stopped.jpg b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/alert_rec_stopped.jpg
new file mode 100644
index 000000000..a96a0bb40
Binary files /dev/null and b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/alert_rec_stopped.jpg differ
diff --git a/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/playback_exit_alert.jpg b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/playback_exit_alert.jpg
new file mode 100644
index 000000000..c7d6989c4
Binary files /dev/null and b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/playback_exit_alert.jpg differ
diff --git a/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/playback_frag.jpg b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/playback_frag.jpg
new file mode 100644
index 000000000..8b71c15aa
Binary files /dev/null and b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/playback_frag.jpg differ
diff --git a/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/recording_activity.jpg b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/recording_activity.jpg
new file mode 100644
index 000000000..35e4fb833
Binary files /dev/null and b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/recording_activity.jpg differ
diff --git a/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/recording_activity_devices.jpg b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/recording_activity_devices.jpg
new file mode 100644
index 000000000..837988f17
Binary files /dev/null and b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/recording_activity_devices.jpg differ
diff --git a/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/recording_activity_paused.jpg b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/recording_activity_paused.jpg
new file mode 100644
index 000000000..912bcccff
Binary files /dev/null and b/plugins/radar-android-phone-audio-input/assets/images/audio_input_plugin/recording_activity_paused.jpg differ
diff --git a/plugins/radar-android-phone-audio-input/build.gradle b/plugins/radar-android-phone-audio-input/build.gradle
new file mode 100644
index 000000000..5e96e4444
--- /dev/null
+++ b/plugins/radar-android-phone-audio-input/build.gradle
@@ -0,0 +1,33 @@
+apply from: "$rootDir/gradle/android.gradle"
+
+android {
+ namespace "org.radarbase.passive.phone.audio.input"
+}
+
+//---------------------------------------------------------------------------//
+// Configuration //
+//---------------------------------------------------------------------------//
+
+description = "Plugin for recording uncompressed high-quality audio."
+
+//---------------------------------------------------------------------------//
+// Sources and classpath configurations //
+//---------------------------------------------------------------------------//
+android {
+ buildFeatures {
+ viewBinding true
+ }
+}
+
+dependencies {
+ api project(":radar-commons-android")
+ implementation "androidx.appcompat:appcompat:$appcompat_version"
+ implementation "com.google.android.material:material:$material_version"
+ implementation "androidx.activity:activity:1.9.0"
+ implementation "androidx.constraintlayout:constraintlayout:$constraintlayout_version"
+ implementation "androidx.legacy:legacy-support-v4:$legacy_support_version"
+ implementation 'androidx.fragment:fragment-ktx:1.8.1'
+}
+
+apply from: "$rootDir/gradle/publishing.gradle"
+apply plugin: 'org.jetbrains.kotlin.android'
\ No newline at end of file
diff --git a/plugins/radar-android-phone-audio-input/src/main/AndroidManifest.xml b/plugins/radar-android-phone-audio-input/src/main/AndroidManifest.xml
new file mode 100644
index 000000000..88a8e7bae
--- /dev/null
+++ b/plugins/radar-android-phone-audio-input/src/main/AndroidManifest.xml
@@ -0,0 +1,21 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputManager.kt b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputManager.kt
new file mode 100644
index 000000000..e07522a01
--- /dev/null
+++ b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputManager.kt
@@ -0,0 +1,419 @@
+/*
+ * Copyright 2017 The Hyve
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.radarbase.passive.phone.audio.input
+
+import android.Manifest
+import android.content.Context
+import android.content.SharedPreferences
+import android.content.pm.PackageManager
+import android.media.AudioDeviceInfo
+import android.media.AudioDeviceInfo.TYPE_BLUETOOTH_A2DP
+import android.media.AudioDeviceInfo.TYPE_BLUETOOTH_SCO
+import android.media.AudioDeviceInfo.TYPE_USB_DEVICE
+import android.media.AudioDeviceInfo.TYPE_USB_HEADSET
+import android.media.AudioDeviceInfo.TYPE_WIRED_HEADSET
+import android.media.AudioFormat
+import android.media.AudioRecord
+import android.media.AudioRecord.STATE_INITIALIZED
+import android.os.Handler
+import android.os.Looper
+import android.os.Process
+import android.widget.Toast
+import androidx.core.content.ContextCompat
+import org.radarbase.android.data.DataCache
+import org.radarbase.android.source.AbstractSourceManager
+import org.radarbase.android.source.SourceStatusListener
+import org.radarbase.android.util.Boast
+import org.radarbase.android.util.SafeHandler
+import org.radarbase.passive.phone.audio.input.PhoneAudioInputService.Companion.LAST_RECORDED_AUDIO_FILE
+import org.radarbase.passive.phone.audio.input.PhoneAudioInputService.Companion.PHONE_AUDIO_INPUT_SHARED_PREFS
+import org.radarbase.passive.phone.audio.input.utils.AudioDeviceUtils
+import org.radarbase.passive.phone.audio.input.utils.AudioTypeFormatUtil
+import org.radarbase.passive.phone.audio.input.utils.AudioTypeFormatUtil.toLogFriendlyType
+import org.radarcns.kafka.ObservationKey
+import org.radarcns.passive.phone.PhoneAudioInput
+import org.slf4j.Logger
+import org.slf4j.LoggerFactory
+import java.io.File
+import java.io.RandomAccessFile
+
+class PhoneAudioInputManager(service: PhoneAudioInputService) : AbstractSourceManager<PhoneAudioInputService, PhoneAudioInputState>(service), PhoneAudioInputState.AudioRecordManager, PhoneAudioInputState.AudioRecordingManager {
+    private val audioInputTopic: DataCache<ObservationKey, PhoneAudioInput> = createCache("android_phone_audio_input", PhoneAudioInput())
+
+ private var audioRecord: AudioRecord? = null
+ private var randomAccessWriter: RandomAccessFile? = null
+ private var buffer: ByteArray = byteArrayOf()
+ private val audioRecordingHandler = SafeHandler.getInstance(
+ "PHONE-AUDIO-INPUT", Process.THREAD_PRIORITY_BACKGROUND)
+ private val recordProcessingHandler: SafeHandler = SafeHandler.getInstance(
+ "AUDIO-RECORD-PROCESSING", Process.THREAD_PRIORITY_AUDIO)
+ private val mainHandler = Handler(Looper.getMainLooper())
+ private val preferences: SharedPreferences =
+ service.getSharedPreferences(PHONE_AUDIO_INPUT_SHARED_PREFS, Context.MODE_PRIVATE)
+
+ var audioSource: Int
+ get() = state.audioSource.get()
+ set(value) { state.audioSource.set(value) }
+ var sampleRate: Int
+ get() = state.sampleRate.get()
+ set(value) { state.sampleRate.set(value) }
+ var channel: Int
+ get() = state.channel.get()
+ set(value) { state.channel.set(value) }
+ var audioFormat: Int
+ get() = state.audioFormat.get()
+ set(value) { state.audioFormat.set(value) }
+ /**
+ * The total number of bytes needed to hold the audio data for one `framePeriod`.
+ */
+ var bufferSize: Int
+ get() = state.bufferSize.get()
+ set(value) { state.bufferSize.set(value) }
+
+ /**
+ * The framePeriod is calculated as the number of samples in `TIMER_INTERVAL` milliseconds.
+ * It represents how many samples correspond to the given interval.
+ */
+ private var framePeriod: Int
+ private var bitsPerSample: Short
+ private var numChannels: Short
+ private val audioDir: File?
+ private var recordingFile: File? = null
+ private var payloadSize: Int = 0
+ @Volatile
+ private var currentlyRecording: Boolean = false
+
+ init {
+ name = service.getString(R.string.phone_audio_input_display_name)
+ bitsPerSample = if (audioFormat == AudioFormat.ENCODING_PCM_16BIT) 16 else 8
+ numChannels = if (channel == AudioFormat.CHANNEL_IN_MONO) 1 else 2
+ framePeriod = sampleRate * TIMER_INTERVAL/1000
+
+ val internalDirs = service.filesDir
+ status = if (internalDirs != null) {
+ audioDir = File(internalDirs, "org.radarbase.passive.phone.audio.input")
+ val dirCreated = audioDir.mkdirs()
+ val directoryExists = audioDir.exists()
+ logger.debug("Directory for saving audio file, created: {}, exists: {}", dirCreated, directoryExists)
+ clearAudioDirectory()
+ SourceStatusListener.Status.READY
+ } else {
+ audioDir = null
+ SourceStatusListener.Status.UNAVAILABLE
+ }
+ }
+
+    override fun start(acceptableIds: Set<String>) {
+ register()
+ audioRecordingHandler.start()
+ recordProcessingHandler.start()
+ createRecorder()
+ state.audioRecordManager = this
+ state.audioRecordingManager = this
+ }
+
+ private val setPreferredDeviceAndUpdate: (AudioDeviceInfo) -> Unit = { microphone ->
+ audioRecord?.preferredDevice = microphone
+ state.finalizedMicrophone.postValue(audioRecord?.preferredDevice)
+ }
+
+
+ private fun createRecorder() {
+ audioRecordingHandler.execute {
+ if (ContextCompat.checkSelfPermission(service, Manifest.permission.RECORD_AUDIO) ==
+ PackageManager.PERMISSION_GRANTED
+ ) {
+ status = SourceStatusListener.Status.CONNECTING
+ // If using default values: framePeriod = 16000 * 0.12 = 1920 samples.
+ framePeriod = sampleRate * TIMER_INTERVAL / 1000
+ // For default values: bufferSize = 1920 * 16 * 1 * 2 / 8 = 7680 bytes.
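+                // The factor of 2 sizes the buffer to hold two frame periods of audio.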
+ bufferSize = framePeriod * bitsPerSample * numChannels * 2 / 8
+ logger.info("Calculated buffer size: $bufferSize (bytes), and frame period: $framePeriod")
+
+ val calculatedBufferSize: Int = AudioRecord.getMinBufferSize(sampleRate, channel, audioFormat)
+ if (calculatedBufferSize != AudioRecord.ERROR_BAD_VALUE) {
+ if (bufferSize < calculatedBufferSize) {
+ bufferSize = calculatedBufferSize
+ framePeriod = bufferSize / (2 * bitsPerSample * numChannels / 8)
+ logger.info("Updating buffer size to: $bufferSize, and frame period to: $framePeriod")
+ }
+ buffer = ByteArray(framePeriod * bitsPerSample / 8 * numChannels)
+ try {
+ audioRecord = AudioRecord(
+ audioSource, sampleRate, channel, audioFormat, bufferSize
+ )
+ if (audioRecord?.state != STATE_INITIALIZED) {
+ disconnect()
+ } else if (audioRecord?.state == STATE_INITIALIZED) {
+ logger.info("Successfully initialized AudioRecord")
+ status = SourceStatusListener.Status.CONNECTED
+ mainHandler.post(::observeMicrophones)
+ }
+ } catch (ex: IllegalArgumentException) {
+ logger.error("Invalid parameters passed to AudioRecord constructor. ", ex)
+ } catch (ex: Exception) {
+ logger.error("Exception while initializing AudioRecord. ", ex)
+ }
+ } else {
+ logger.error("Error in calculating buffer size")
+ disconnect()
+ }
+ } else {
+ logger.error("Permission not granted for RECORD_AUDIO, disconnecting now")
+ disconnect()
+ }
+ }
+ }
+
+ override fun startRecording() {
+ startAudioRecording()
+ }
+
+ override fun pauseRecording() {
+ state.isPaused.value = true
+ }
+
+ override fun resumeRecording() {
+ state.isPaused.value = false
+ }
+
+ override fun stopRecording() {
+ stopAudioRecording()
+ }
+
+ override fun clear() {
+ clearAudioDirectory()
+ }
+
+ override fun send() {
+ state.finalizedMicrophone.value?.let { mic ->
+ recordingFile?.let { wavFile ->
+ var audioDuration: Long = -1L
+ try {
+ audioDuration = AudioDeviceUtils.getAudioDuration(wavFile) ?: throw RuntimeException("Can't retrieve the duration of audio file")
+ if (audioDuration == -1L) {
+ throw RuntimeException("Audio length not retrieved")
+ }
+ } catch (ex: Exception) {
+ logger.error("Cannot retrieve audio file duration. Discarding sending data")
+ }
+ send(
+ audioInputTopic, PhoneAudioInput(
+ currentTime, currentTime, "will be set after s3 functionality is added",
+ "after data sending to s3 is enabled", mic.productName.toString(), mic.id.toString(),
+ mic.sampleRates.joinToString(" "),
+ mic.encodings.joinToString(" ") {
+ AudioTypeFormatUtil.toLogFriendlyEncoding(it) },
+ mic.type.toLogFriendlyType(), mic.channelCounts.joinToString(" "), audioDuration, wavFile.length(),
+ state.isRecordingPlayed, wavFile.extension, sampleRate, AudioTypeFormatUtil.toLogFriendlyEncoding(audioFormat)
+ )
+ )
+            // Dummy Toast; will be removed once file uploading to S3 is enabled.
+ Boast.makeText(service, "Sending last recorded audio. Is played? : ${state.isRecordingPlayed}", Toast.LENGTH_LONG).show()
+
+ }
+ }
+
+ // After the data is sent:
+ state.isRecordingPlayed = false
+ }
+
+ override fun setPreferredMicrophone(microphone: AudioDeviceInfo) {
+ state.microphonePrioritized = true
+ microphone.let(setPreferredDeviceAndUpdate)
+ }
+
+ private fun observeMicrophones() {
+ state.connectedMicrophones.observe(service) { connectedMicrophones ->
+ audioRecordingHandler.execute {
+ if (connectedMicrophones?.size == 0) {
+ logger.warn("No connected microphone")
+ }
+ if (state.microphonePrioritized && state.finalizedMicrophone.value !in connectedMicrophones) {
+ state.microphonePrioritized = false
+ logger.info("Microphone prioritized: false")
+ }
+ logger.info(
+ "PhoneAudioInputManager: Connected microphones: {}",
+ connectedMicrophones.map { it.productName })
+
+ if (!state.microphonePrioritized) {
+ connectedMicrophones.also(::runDeviceSelectionLogic)
+ } else {
+ state.finalizedMicrophone.value?.let(setPreferredDeviceAndUpdate)
+ }
+ }
+ }
+ }
+
+    private fun runDeviceSelectionLogic(connectedMicrophones: List<AudioDeviceInfo>) {
+ logger.info("Running device selection logic")
+ (arrayOf(TYPE_USB_DEVICE, TYPE_USB_HEADSET).let { deviceTypes ->
+ connectedMicrophones.run { preferByDeviceType(deviceTypes) }
+ }?: arrayOf(TYPE_WIRED_HEADSET).let { deviceTypes ->
+ connectedMicrophones.run { preferByDeviceType(deviceTypes) }
+ } ?: arrayOf(TYPE_BLUETOOTH_A2DP, TYPE_BLUETOOTH_SCO).let { deviceTypes ->
+ connectedMicrophones.run { preferByDeviceType(deviceTypes) }
+ } ?: connectedMicrophones.firstOrNull())?.also(setPreferredDeviceAndUpdate)
+ }
+
+    private fun List<AudioDeviceInfo>.preferByDeviceType(types: Array<Int>): AudioDeviceInfo? =
+ types.firstNotNullOfOrNull { deviceType ->
+ this.firstOrNull { deviceType == it.type }
+ }
+
+ private fun clearAudioDirectory() {
+ payloadSize = 0
+ audioDir?.let { audioDir ->
+ audioDir.parentFile
+ ?.list { _, name -> name.startsWith("phone_audio_input") && name.endsWith(".wav") }
+ ?.forEach {
+ File(audioDir.parentFile, it).delete()
+ logger.debug("Deleted audio file: {}", it)
+ }
+
+ audioDir.walk()
+ .filter { it.name.startsWith("phone_audio_input") && it.path.endsWith(".wav") }
+ .forEach {
+ it.delete()
+ logger.debug("Deleted file: {}", it)
+ }
+ }
+ }
+
+ private fun startAudioRecording() {
+ audioRecordingHandler.execute {
+ setupRecording()
+ if ((audioRecord?.state == STATE_INITIALIZED) && (recordingFile != null)) {
+ if (!state.microphonePrioritized) {
+ state.connectedMicrophones.value?.also(::runDeviceSelectionLogic)
+ } else {
+ state.finalizedMicrophone.value?.let(setPreferredDeviceAndUpdate)
+ }
+ audioRecord?.startRecording()
+ mainHandler.post { state.isRecording.value = true }
+ recordProcessingHandler.execute{
+ audioRecord?.read(buffer, 0, buffer.size)
+ currentlyRecording = true
+ state.finalizedMicrophone.postValue(audioRecord?.routedDevice)
+ logger.info("Finalized routed device: {}", state.finalizedMicrophone.value?.productName)
+ }
+ logger.trace("Started recording")
+ audioRecord?.setRecordPositionUpdateListener(updateListener)
+ audioRecord?.setPositionNotificationPeriod(framePeriod)
+ } else {
+ logger.error("Trying to start recording on uninitialized AudioRecord or filePath is null, state: ${audioRecord?.state}, $recordingFile")
+ disconnect()
+ }
+ }
+ }
+
+ private fun setupRecording() {
+ clearAudioDirectory()
+ setRecordingPath()
+ writeFileHeaders()
+ }
+
+ private fun setRecordingPath() {
+ recordingFile = File(audioDir, "phone_audio_input" + System.currentTimeMillis() + ".wav")
+
+ preferences.edit()
+ .putString(LAST_RECORDED_AUDIO_FILE, recordingFile!!.absolutePath)
+ .apply()
+ randomAccessWriter = RandomAccessFile(recordingFile, "rw")
+ }
+
+ private fun writeFileHeaders() {
+
+ randomAccessWriter?.apply {
+ setLength(0) // Set file length to 0, to prevent unexpected behavior in case the file already existed
+
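+            // 44 bytes is the size of a canonical PCM WAV header (RIFF descriptor plus "fmt " and "data" sub-chunk headers).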
+ val header = ByteArray(44)
+ AudioDeviceUtils.setWavHeaders(header, numChannels, sampleRate, bitsPerSample)
+ write(header, 0, 44)
+ }
+ }
+
+ private val updateListener = object : AudioRecord.OnRecordPositionUpdateListener {
+ override fun onMarkerReached(recorder: AudioRecord?) {
+ // No Action
+ }
+
+ override fun onPeriodicNotification(recorder: AudioRecord?) {
+ if (currentlyRecording && !state.isPaused.value!!) {
+ audioRecordingHandler.execute {
+ audioRecord?.let {
+ val dataRead = it.read(buffer, 0, buffer.size)
+ randomAccessWriter?.write(buffer)
+ payloadSize += dataRead
+ logger.debug("onPeriodicNotification: Recording Audio")
+ }
+ }
+ } else if (state.isPaused.value!!) {
+ // Triggering a dummy read to keep the callback active
+ audioRecordingHandler.execute { audioRecord?.read(buffer, 0, buffer.size) }
+ logger.debug("Callback: onPeriodicNotification: recording is paused.")
+ }
+ else {
+ logger.debug("Callback: onPeriodicNotification after recording is stopped.")
+ }
+ }
+ }
+
+ private fun stopAudioRecording() {
+ logger.debug("Stopping Recording: Saving data")
+ mainHandler.post { state.isRecording.value = false }
+ audioRecordingHandler.execute {
+ currentlyRecording = false
+ audioRecord?.apply {
+ setRecordPositionUpdateListener(null)
+ stop()
+ }
+ randomAccessWriter?.apply {
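+            // Back-fill the WAV header in little-endian order: the RIFF chunk size (36 + data size) at byte offset 4,
+            // and the "data" sub-chunk size at byte offset 40.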
+ val chunkSize = 36 + payloadSize
+ seek(4)
+ write(chunkSize and 0xff)
+ write((chunkSize shr 8) and 0xff)
+ write((chunkSize shr 16) and 0xff)
+ write((chunkSize shr 24) and 0xff)
+
+ seek(40)
+ write(payloadSize and 0xff)
+ write((payloadSize shr 8) and 0xff)
+ write((payloadSize shr 16) and 0xff)
+ write((payloadSize shr 24) and 0xff)
+ close()
+ payloadSize = 0
+ }
+ }
+ }
+
+ override fun onClose() {
+ audioRecordingHandler.stop{
+ audioRecord?.release()
+ clearAudioDirectory()
+ }
+ recordProcessingHandler.stop()
+ }
+ companion object {
+ private val logger: Logger = LoggerFactory.getLogger(PhoneAudioInputManager::class.java)
+
+        /** The interval (in milliseconds) at which the recorded samples are written to the file. */
+ private const val TIMER_INTERVAL = 120
+ }
+}
\ No newline at end of file
diff --git a/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputProvider.kt b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputProvider.kt
new file mode 100644
index 000000000..99a010675
--- /dev/null
+++ b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputProvider.kt
@@ -0,0 +1,54 @@
+/*
+ * Copyright 2017 The Hyve
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.radarbase.passive.phone.audio.input
+
+import android.Manifest
+import android.content.Intent
+import org.radarbase.android.BuildConfig
+import org.radarbase.android.RadarService
+import org.radarbase.android.source.SourceProvider
+import org.radarbase.passive.phone.audio.input.ui.PhoneAudioInputActivity
+
+class PhoneAudioInputProvider(radarService: RadarService): SourceProvider<PhoneAudioInputState>(radarService) {
+
+ override val description: String
+ get() = radarService.getString(R.string.phone_audio_input_description)
+    override val pluginNames: List<String>
+ get() = listOf(
+ "phone_audio_input",
+ "audio_input",
+ ".phone.PhoneAudioInputProvider",
+ "org.radarbase.passive.phone.audio.input.PhoneAudioInputProvider"
+
+ )
+    override val serviceClass: Class<PhoneAudioInputService>
+ get() = PhoneAudioInputService::class.java
+ override val displayName: String
+ get() = radarService.getString(R.string.phone_audio_input_display_name)
+ override val sourceProducer: String
+ get() = "ANDROID"
+ override val sourceModel: String
+ get() = "PHONE"
+ override val version: String
+ get() = BuildConfig.VERSION_NAME
+    override val permissionsNeeded: List<String>
+ get() = listOf(Manifest.permission.RECORD_AUDIO)
+    override val actions: List<Action>
+ get() = listOf(Action(radarService.getString(R.string.startRecordingActivity)){
+ startActivity(Intent(this, PhoneAudioInputActivity::class.java))
+ })
+}
\ No newline at end of file
diff --git a/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputService.kt b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputService.kt
new file mode 100644
index 000000000..2d497d596
--- /dev/null
+++ b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputService.kt
@@ -0,0 +1,60 @@
+/*
+ * Copyright 2017 The Hyve
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.radarbase.passive.phone.audio.input
+
+import android.media.AudioFormat
+import android.media.MediaRecorder
+import org.radarbase.android.config.SingleRadarConfiguration
+import org.radarbase.android.source.SourceManager
+import org.radarbase.android.source.SourceService
+
+class PhoneAudioInputService: SourceService<PhoneAudioInputState>() {
+
+ override val defaultState: PhoneAudioInputState
+ get() = PhoneAudioInputState()
+
+ override fun createSourceManager(): PhoneAudioInputManager = PhoneAudioInputManager(this)
+
+ override fun configureSourceManager(
+        manager: SourceManager<PhoneAudioInputState>,
+ config: SingleRadarConfiguration
+ ) {
+ manager as PhoneAudioInputManager
+ manager.audioSource = config.getInt(PHONE_AUDIO_INPUT_AUDIO_SOURCE, PHONE_AUDIO_INPUT_AUDIO_SOURCE_DEFAULT)
+ manager.bufferSize = config.getInt(PHONE_AUDIO_INPUT_RECORDER_BUFFER_SIZE, PHONE_AUDIO_INPUT_RECORDER_BUFFER_SIZE_DEFAULT)
+ manager.audioFormat = config.getInt(PHONE_AUDIO_INPUT_CURRENT_AUDIO_FORMAT, PHONE_AUDIO_INPUT_CURRENT_AUDIO_FORMAT_DEFAULT)
+ manager.channel = config.getInt(PHONE_AUDIO_INPUT_CURRENT_CHANNEL, PHONE_AUDIO_INPUT_CURRENT_CHANNEL_DEFAULT)
+ manager.sampleRate = config.getInt(PHONE_AUDIO_INPUT_CURRENT_SAMPLE_RATE, PHONE_AUDIO_INPUT_CURRENT_SAMPLE_RATE_DEFAULT)
+ }
+
+ companion object {
+ private const val PHONE_AUDIO_INPUT_PREFIX = "phone-audio-input-"
+ const val PHONE_AUDIO_INPUT_AUDIO_SOURCE = PHONE_AUDIO_INPUT_PREFIX + "audio-source"
+ const val PHONE_AUDIO_INPUT_RECORDER_BUFFER_SIZE = PHONE_AUDIO_INPUT_PREFIX + "recorder-buffer-size"
+ const val PHONE_AUDIO_INPUT_CURRENT_AUDIO_FORMAT = PHONE_AUDIO_INPUT_PREFIX + "current-audio-format"
+ const val PHONE_AUDIO_INPUT_CURRENT_CHANNEL = PHONE_AUDIO_INPUT_PREFIX + "current-channel"
+ const val PHONE_AUDIO_INPUT_CURRENT_SAMPLE_RATE = PHONE_AUDIO_INPUT_PREFIX + "current-sample-rate"
+ const val LAST_RECORDED_AUDIO_FILE = PHONE_AUDIO_INPUT_PREFIX + "last-recorded-audio-file"
+ const val PHONE_AUDIO_INPUT_SHARED_PREFS = PHONE_AUDIO_INPUT_PREFIX + "shared-prefs"
+
+ const val PHONE_AUDIO_INPUT_AUDIO_SOURCE_DEFAULT = MediaRecorder.AudioSource.MIC
+ const val PHONE_AUDIO_INPUT_RECORDER_BUFFER_SIZE_DEFAULT = -1
+ const val PHONE_AUDIO_INPUT_CURRENT_AUDIO_FORMAT_DEFAULT = AudioFormat.ENCODING_PCM_16BIT
+ const val PHONE_AUDIO_INPUT_CURRENT_CHANNEL_DEFAULT = AudioFormat.CHANNEL_IN_MONO
+ const val PHONE_AUDIO_INPUT_CURRENT_SAMPLE_RATE_DEFAULT = 16000
+ }
+}
\ No newline at end of file
diff --git a/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputState.kt b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputState.kt
new file mode 100644
index 000000000..c4f563a0d
--- /dev/null
+++ b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/PhoneAudioInputState.kt
@@ -0,0 +1,60 @@
+/*
+ * Copyright 2017 The Hyve
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.radarbase.passive.phone.audio.input
+
+import android.media.AudioDeviceInfo
+import android.media.AudioFormat
+import android.media.MediaRecorder
+import androidx.lifecycle.LiveData
+import androidx.lifecycle.MutableLiveData
+import org.radarbase.android.source.BaseSourceState
+import org.radarbase.passive.phone.audio.input.PhoneAudioInputService.Companion.PHONE_AUDIO_INPUT_CURRENT_SAMPLE_RATE_DEFAULT
+import org.radarbase.passive.phone.audio.input.PhoneAudioInputService.Companion.PHONE_AUDIO_INPUT_RECORDER_BUFFER_SIZE_DEFAULT
+import java.util.concurrent.atomic.AtomicInteger
+
+class PhoneAudioInputState: BaseSourceState() {
+
+ var audioRecordManager: AudioRecordManager? = null
+ var audioRecordingManager: AudioRecordingManager? = null
+    val isRecording: MutableLiveData<Boolean?> = MutableLiveData(null)
+    val isPaused: MutableLiveData<Boolean> = MutableLiveData(false)
+    val connectedMicrophones: MutableLiveData<List<AudioDeviceInfo>> = MutableLiveData<List<AudioDeviceInfo>>(emptyList())
+    val finalizedMicrophone: MutableLiveData<AudioDeviceInfo?> = MutableLiveData()
+    var microphonePrioritized: Boolean = false
+ var isRecordingPlayed: Boolean = false
+
+ var audioSource: AtomicInteger = AtomicInteger(MediaRecorder.AudioSource.MIC)
+ var sampleRate: AtomicInteger = AtomicInteger(PHONE_AUDIO_INPUT_CURRENT_SAMPLE_RATE_DEFAULT)
+ var channel: AtomicInteger = AtomicInteger(AudioFormat.CHANNEL_IN_MONO)
+ var audioFormat: AtomicInteger = AtomicInteger(AudioFormat.ENCODING_PCM_16BIT)
+ var bufferSize: AtomicInteger = AtomicInteger(PHONE_AUDIO_INPUT_RECORDER_BUFFER_SIZE_DEFAULT)
+
+
+ interface AudioRecordManager {
+ fun startRecording()
+ fun stopRecording()
+ fun pauseRecording()
+ fun resumeRecording()
+ fun clear()
+ fun setPreferredMicrophone(microphone: AudioDeviceInfo)
+ }
+
+ interface AudioRecordingManager {
+ fun send()
+ }
+
+}
\ No newline at end of file
diff --git a/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/ui/PhoneAudioInputActivity.kt b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/ui/PhoneAudioInputActivity.kt
new file mode 100644
index 000000000..7d9f80513
--- /dev/null
+++ b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/ui/PhoneAudioInputActivity.kt
@@ -0,0 +1,471 @@
+/*
+ * Copyright 2017 The Hyve
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.radarbase.passive.phone.audio.input.ui
+
+import android.content.ComponentName
+import android.content.Context
+import android.content.DialogInterface
+import android.content.Intent
+import android.content.ServiceConnection
+import android.content.SharedPreferences
+import android.media.AudioDeviceInfo
+import android.os.Bundle
+import android.os.Handler
+import android.os.IBinder
+import android.os.Looper
+import android.view.MotionEvent
+import android.view.View
+import android.view.ViewGroup
+import android.view.WindowManager
+import android.widget.AdapterView
+import android.widget.AdapterView.OnItemSelectedListener
+import android.widget.ArrayAdapter
+import android.widget.FrameLayout
+import android.widget.Spinner
+import android.widget.Toast
+import androidx.activity.OnBackPressedCallback
+import androidx.activity.enableEdgeToEdge
+import androidx.appcompat.app.AppCompatActivity
+import androidx.fragment.app.Fragment
+import androidx.fragment.app.commit
+import androidx.lifecycle.Observer
+import androidx.lifecycle.ViewModelProvider
+import org.radarbase.android.IRadarBinder
+import org.radarbase.android.RadarApplication.Companion.radarApp
+import org.radarbase.android.source.SourceStatusListener
+import org.radarbase.android.util.Boast
+import org.radarbase.passive.phone.audio.input.PhoneAudioInputProvider
+import org.radarbase.passive.phone.audio.input.PhoneAudioInputService.Companion.LAST_RECORDED_AUDIO_FILE
+import org.radarbase.passive.phone.audio.input.PhoneAudioInputService.Companion.PHONE_AUDIO_INPUT_SHARED_PREFS
+import org.radarbase.passive.phone.audio.input.PhoneAudioInputState
+import org.radarbase.passive.phone.audio.input.R
+import org.radarbase.passive.phone.audio.input.databinding.ActivityPhoneAudioInputBinding
+import org.radarbase.passive.phone.audio.input.utils.AudioDeviceUtils
+import org.slf4j.Logger
+import org.slf4j.LoggerFactory
+
+class PhoneAudioInputActivity : AppCompatActivity() {
+
+ private lateinit var binding: ActivityPhoneAudioInputBinding
+ private lateinit var spinner: Spinner
+    private var adapter: ArrayAdapter<String>? = null
+
+ private var isUserInitiatedSection: Boolean = false
+ private var isRecording: Boolean = false
+ private val lastRecordedAudioFile: String?
+ get() = preferences?.getString(LAST_RECORDED_AUDIO_FILE, null)
+ private var previousDevice: String? = null
+ private var addStateToVM: (() -> Unit)? = null
+ private var postNullState: (() -> Unit)? = null
+ private var viewModelInitializer: (() -> Unit)? = null
+
+ private val mainHandler: Handler = Handler(Looper.getMainLooper())
+ private var recorderProvider: PhoneAudioInputProvider? = null
+ private var audioInputViewModel: PhoneAudioInputViewModel? = null
+ private val state: PhoneAudioInputState?
+ get() = recorderProvider?.connection?.sourceState
+ private var preferences: SharedPreferences? = null
+    private val microphones: MutableList<AudioDeviceInfo> = mutableListOf()
+
+ private val radarServiceConnection = object : ServiceConnection{
+ override fun onServiceConnected(name: ComponentName?, service: IBinder?) {
+ logger.debug("Service bound to PhoneAudioInputActivity")
+ val radarService = service as IRadarBinder
+ recorderProvider = null
+ for (provider in radarService.connections) {
+ if (provider is PhoneAudioInputProvider) {
+ recorderProvider = provider
+ }
+ }
+ if (state == null) {
+ logger.info("Cannot set the microphone state is null")
+ Boast.makeText(this@PhoneAudioInputActivity,
+ R.string.unable_to_record_toast, Toast.LENGTH_SHORT).show(true)
+ return
+ }
+ state?.let {
+ it.isRecording.observe(this@PhoneAudioInputActivity, isRecordingObserver)
+ it.finalizedMicrophone.observe(this@PhoneAudioInputActivity, currentMicrophoneObserver)
+ addStateToVM = {
+ audioInputViewModel?.phoneAudioState?.postValue(it)
+ }
+ mainHandler.postDelayed (addStateToVM!!, 500)
+ }
+ refreshInputDevices()
+ }
+
+ override fun onServiceDisconnected(name: ComponentName?) {
+ logger.debug("Service unbound from PhoneAudiInputActivity")
+ state?.let {
+ it.isRecording.removeObserver(isRecordingObserver)
+ it.finalizedMicrophone.removeObserver(currentMicrophoneObserver)
+ postNullState = {
+ audioInputViewModel?.phoneAudioState?.postValue(null)
+ }
+ mainHandler.postDelayed (postNullState!!, 500)
+ }
+ recorderProvider = null
+ }
+ }
+
+    private val isRecordingObserver: Observer<Boolean?> = Observer { isRecording: Boolean? ->
+ isRecording ?: return@Observer
+ if (isRecording) {
+ this@PhoneAudioInputActivity.isRecording = true
+ logger.debug("Switching to Stop Recording mode")
+ audioInputViewModel?.startTimer()
+ window.addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON)
+ onRecordingViewUpdate()
+ } else {
+ this@PhoneAudioInputActivity.isRecording = false
+ logger.debug("Switching to Start Recording mode")
+ audioInputViewModel?.stopTimer()
+ window.clearFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON)
+ notRecordingViewUpdate()
+ }
+ }
+
+    private val currentMicrophoneObserver: Observer<AudioDeviceInfo?> = Observer { microphone: AudioDeviceInfo? ->
+ microphone?.productName?.toString()?.let { productName ->
+ adapter?.apply {
+ val deviceName = getPosition(productName)
+ spinner.setSelection(deviceName)
+ }
+ }
+ }
+
+    private val recordTimeObserver: Observer<String> = Observer { elapsedTime ->
+ binding.tvRecordTimer.text = elapsedTime
+ }
+
+ private fun getAudioDeviceByPosition(position: Int): AudioDeviceInfo? =
+ (state?.connectedMicrophones?.value)?.get(position)
+
+ private val setPreferredMicrophone: String.(Int) -> Unit = { pos: Int ->
+ pos.let(::getAudioDeviceByPosition)?.also {
+ if (it.productName != this) return@also
+ }?.run {
+ state?.audioRecordManager?.setPreferredMicrophone(this)
+ Boast.makeText(this@PhoneAudioInputActivity, getString(
+ R.string.input_audio_device,
+ productName), Toast.LENGTH_SHORT).show(true)
+ }
+ }
+
+ override fun onCreate(savedInstanceState: Bundle?) {
+ super.onCreate(savedInstanceState)
+ enableEdgeToEdge()
+ binding = ActivityPhoneAudioInputBinding.inflate(layoutInflater)
+ setContentView(binding.root)
+ spinner = binding.spinnerSelectDevice
+ preferences = getSharedPreferences(PHONE_AUDIO_INPUT_SHARED_PREFS, Context.MODE_PRIVATE)
+ createDropDown()
+ disableButtonsInitially()
+
+ intent?.let {
+ if (it.hasExtra(EXTERNAL_DEVICE_NAME)) {
+ previousDevice = it.getStringExtra(EXTERNAL_DEVICE_NAME)
+ logger.debug("Previously preferred device: {}", previousDevice)
+ }
+ }
+
+ viewModelInitializer = {
+ audioInputViewModel = ViewModelProvider(this)[PhoneAudioInputViewModel::class.java].apply {
+ elapsedTime.observe(this@PhoneAudioInputActivity, recordTimeObserver)
+ }
+ logger.trace("Making buttons visible now")
+ if (isRecording) {
+ onRecordingViewUpdate()
+ } else {
+ notRecordingViewUpdate()
+ }
+ }
+ mainHandler.postDelayed (viewModelInitializer!!, 500)
+ manageBackPress()
+ }
+
+ private fun manageBackPress() {
+ this.onBackPressedDispatcher.addCallback(this, object: OnBackPressedCallback(true) {
+ override fun handleOnBackPressed() {
+ if (isRecording) {
+ AudioDeviceUtils.showAlertDialog(this@PhoneAudioInputActivity) {
+ setTitle(getString(R.string.cannot_close_activity))
+ .setMessage(getString(R.string.cannot_close_activity_message))
+ .setNeutralButton(getString(R.string.ok)) { dialog: DialogInterface, _ ->
+ dialog.dismiss()
+ }
+ }
+ } else {
+ isEnabled = false
+ this@PhoneAudioInputActivity.onBackPressedDispatcher.onBackPressed()
+ }
+ }
+ })
+ }
+
+ private fun createDropDown() {
+ spinner.setOnTouchListener { v, event ->
+ isUserInitiatedSection = true
+ if (event.action == MotionEvent.ACTION_UP) {
+ v.performClick()
+ }
+ false
+ }
+ spinner.onItemSelectedListener = object: OnItemSelectedListener {
+ override fun onItemSelected(parent: AdapterView<*>?, view: View?, position: Int, id: Long) {
+ if (isUserInitiatedSection) {
+ parent?.getItemAtPosition(position).toString().apply { setPreferredMicrophone(position) }
+ isUserInitiatedSection = false
+ }
+ }
+
+ override fun onNothingSelected(parent: AdapterView<*>?) {
+ // No Action
+ }
+ }
+ createAdapter()
+ }
+
+ private fun createAdapter() {
+ adapter = ArrayAdapter(this, R.layout.dropdown_item, microphones.map { it.productName.toString() } )
+ adapter!!.setDropDownViewResource(R.layout.dropdown_item)
+ spinner.adapter = adapter
+ }
+
+ override fun onStart() {
+ super.onStart()
+ bindService(Intent(this, radarApp.radarService), radarServiceConnection, 0)
+ binding.refreshButton.setOnClickListener{
+ refreshInputDevices()
+ }
+ }
+
+ private fun workOnStateElseShowToast(work: PhoneAudioInputState.() -> Unit) {
+ if (state != null && state?.status == SourceStatusListener.Status.CONNECTED) {
+ state?.apply(work) ?: return
+ } else {
+ Boast.makeText(this, R.string.unable_to_record_toast, Toast.LENGTH_SHORT).show(true)
+ }
+ }
+
+ private fun disableButtonsInitially() {
+ binding.apply {
+ btnStartRec.setVisibleAndDisabled()
+ btnStopRec.setInvisibleAndDisabled()
+ btnPauseRec.setInvisibleAndDisabled()
+ btnResumeRec.setInvisibleAndDisabled()
+ }
+ }
+
+ private fun onRecordingViewUpdate() {
+ binding.apply {
+ btnStartRec.setInvisibleAndDisabled()
+ btnStopRec.setVisibleAndEnabled()
+ }
+ }
+
+ private fun notRecordingViewUpdate() {
+ binding.apply {
+ btnStartRec.setVisibleAndEnabled()
+ btnStopRec.setInvisibleAndDisabled()
+ }
+ }
+
+ private fun onPauseViewUpdate() {
+ binding.apply {
+ btnStartRec.setInvisibleAndDisabled()
+ btnStopRec.setVisibleAndDisabled()
+ btnPauseRec.setInvisibleAndDisabled()
+ btnResumeRec.setVisibleAndEnabled()
+ }
+ }
+
+ private fun onResumeViewUpdate() {
+ binding.apply {
+ btnStartRec.setInvisibleAndDisabled()
+ btnStopRec.setVisibleAndEnabled()
+ btnPauseRec.setVisibleAndEnabled()
+ btnResumeRec.setInvisibleAndDisabled()
+ }
+ }
+
+ override fun onResume() {
+ super.onResume()
+ binding.btnStartRec.setOnClickListener {
+ workOnStateElseShowToast {
+ logger.debug("Starting Recording")
+ refreshInputDevices()
+ audioRecordManager?.startRecording()
+ binding.btnPauseRec.setVisibleAndEnabled()
+ binding.btnResumeRec.setInvisibleAndDisabled()
+ }
+ }
+
+ binding.btnStopRec.setOnClickListener {
+ workOnStateElseShowToast {
+ logger.debug("Stopping Recording")
+ audioRecordManager?.stopRecording()
+ binding.btnPauseRec.setInvisibleAndDisabled()
+ binding.btnResumeRec.setInvisibleAndDisabled()
+ proceedAfterRecording()
+ }
+ }
+
+ binding.btnPauseRec.setOnClickListener {
+ onPauseViewUpdate()
+ audioInputViewModel?.pauseTimer()
+ state?.let { it.audioRecordManager?.apply { pauseRecording() } }
+ }
+
+ binding.btnResumeRec.setOnClickListener {
+ onResumeViewUpdate()
+ audioInputViewModel?.resumeTimer()
+ state?.let { it.audioRecordManager?.apply { resumeRecording() } }
+ }
+ }
+
+ private fun refreshInputDevices() {
+ state ?: return
+        val connectedMicrophones: List<AudioDeviceInfo> = AudioDeviceUtils.getConnectedMicrophones(this@PhoneAudioInputActivity)
+ state?.let { state ->
+ state.connectedMicrophones.postValue(connectedMicrophones)
+ microphones.clear()
+ microphones.addAll(connectedMicrophones)
+ logger.trace("Modifying dropdown: microphones: {}", microphones.map { it.productName.toString() })
+ if (state.microphonePrioritized && state.finalizedMicrophone.value !in connectedMicrophones) {
+ state.microphonePrioritized = false
+ logger.trace("Activity: Microphone prioritized?: false")
+ }
+ runOnUiThread {
+ createAdapter()
+ try {
+ adapter?.getPosition(previousDevice)?.also {
+ previousDevice?.setPreferredMicrophone(it)
+ }
+ } catch (ex: Exception) {
+ logger.warn("Cannot select last preferred microphone")
+ }
+ if (state.microphonePrioritized) {
+ val microphone = adapter?.getPosition(state.finalizedMicrophone.value?.productName.toString())
+ microphone ?: return@runOnUiThread
+ spinner.setSelection(microphone)
+ }
+ }
+ }
+ }
+
+ private fun proceedAfterRecording() {
+ AudioDeviceUtils.showAlertDialog (this) {
+ setTitle(getString(R.string.proceed_title))
+ .setMessage(getString(R.string.proceed_message))
+ .setPositiveButton(getString(R.string.send)) { dialog: DialogInterface, _: Int ->
+ dialog.dismiss()
+ state?.audioRecordingManager?.send()
+ logger.debug("Sending the data")
+ }.setNeutralButton(getString(R.string.play)) { dialog: DialogInterface, _: Int ->
+ dialog.dismiss()
+ logger.debug("Playing the audio")
+ startAudioPlaybackFragment()
+ }.setNegativeButton(getString(R.string.discard)) { dialog: DialogInterface, _: Int ->
+ logger.debug("Discarding the last recorded file")
+ dialog.cancel()
+ state?.isRecordingPlayed = false
+ clearLastRecordedFile()
+ state?.audioRecordManager?.clear()
+ }
+ }
+ }
+
+ private fun clearLastRecordedFile() {
+ preferences?.apply {
+ edit()
+ .putString(LAST_RECORDED_AUDIO_FILE, null)
+ .apply()
+ }
+ }
+
+ private fun startAudioPlaybackFragment() {
+ logger.info("Starting audio playback fragment")
+ if (lastRecordedAudioFile == null) {
+ logger.error("Last recorded audio file lost!! Not Starting playback fragment")
+ return
+ }
+ try {
+ val fragment = PhoneAudioInputPlaybackFragment.newInstance(lastRecordedAudioFile!!)
+ createPlaybackFragmentLayout(R.id.phone_audio_playback_fragment, fragment)
+ } catch (ex: IllegalStateException) {
+ logger.error("Failed to start audio playback fragment: is PhoneAudioInputActivity already closed?", ex)
+ }
+ }
+
+ private fun createPlaybackFragmentLayout(id: Int, fragment: Fragment) {
+ setContentView(FrameLayout(this).apply {
+            this.id = id
+            layoutParams = ViewGroup.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)
+ })
+ supportFragmentManager.commit {
+ add(id, fragment)
+ }
+ }
+
+ override fun onStop() {
+ super.onStop()
+ state?.apply{
+ isRecording.value?.let {
+ audioRecordManager?.stopRecording()
+ }
+ microphonePrioritized = false
+ isRecording.postValue(null)
+ }
+ unbindService(radarServiceConnection)
+ }
+
+ private fun removeVMCallbacks() {
+ viewModelInitializer?.toRunnable()?.let(mainHandler::removeCallbacks)
+ addStateToVM?.toRunnable()?.let(mainHandler::removeCallbacks)
+ postNullState?.toRunnable()?.let(mainHandler::removeCallbacks)
+ }
+
+ private fun (() -> Unit).toRunnable(): Runnable = Runnable(this)
+
+ override fun onDestroy() {
+ removeVMCallbacks()
+ super.onDestroy()
+ }
+
+ companion object {
+ private val logger: Logger = LoggerFactory.getLogger(PhoneAudioInputActivity::class.java)
+
+ const val AUDIO_FILE_NAME = "phone-audio-playback-audio-file-name"
+ const val EXTERNAL_DEVICE_NAME = "EXTERNAL-DEVICE-NAME"
+
+ private fun View.setVisibleAndEnabled() {
+ visibility = View.VISIBLE
+ isEnabled = true
+ }
+
+ private fun View.setVisibleAndDisabled() {
+ visibility = View.VISIBLE
+ isEnabled = false
+ }
+
+ private fun View.setInvisibleAndDisabled() {
+ visibility = View.INVISIBLE
+ isEnabled = false
+ }
+ }
+}
\ No newline at end of file
diff --git a/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/ui/PhoneAudioInputPlaybackFragment.kt b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/ui/PhoneAudioInputPlaybackFragment.kt
new file mode 100644
index 000000000..39e32287f
--- /dev/null
+++ b/plugins/radar-android-phone-audio-input/src/main/java/org/radarbase/passive/phone/audio/input/ui/PhoneAudioInputPlaybackFragment.kt
@@ -0,0 +1,313 @@
+/*
+ * Copyright 2017 The Hyve
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.radarbase.passive.phone.audio.input.ui
+
+import android.content.DialogInterface
+import android.content.Intent
+import android.media.AudioDeviceInfo
+import android.media.MediaPlayer
+import android.os.Bundle
+import android.os.Handler
+import android.os.Looper
+import android.os.Process
+import android.view.LayoutInflater
+import android.view.View
+import android.view.ViewGroup
+import android.view.WindowManager
+import android.widget.Button
+import android.widget.SeekBar
+import android.widget.SeekBar.OnSeekBarChangeListener
+import android.widget.Toast
+import androidx.activity.OnBackPressedCallback
+import androidx.fragment.app.Fragment
+import androidx.fragment.app.activityViewModels
+import org.radarbase.android.util.Boast
+import org.radarbase.android.util.SafeHandler
+import org.radarbase.passive.phone.audio.input.PhoneAudioInputState
+import org.radarbase.passive.phone.audio.input.R
+import org.radarbase.passive.phone.audio.input.ui.PhoneAudioInputActivity.Companion.AUDIO_FILE_NAME
+import org.radarbase.passive.phone.audio.input.ui.PhoneAudioInputActivity.Companion.EXTERNAL_DEVICE_NAME
+import org.radarbase.passive.phone.audio.input.databinding.FragmentAudioInputPlaybackBinding
+import org.radarbase.passive.phone.audio.input.utils.AudioDeviceUtils
+import org.slf4j.Logger
+import org.slf4j.LoggerFactory
+import java.io.File
+import java.io.IOException
+
+
+class PhoneAudioInputPlaybackFragment : Fragment() {
+
+ private var binding: FragmentAudioInputPlaybackBinding? = null
+    private lateinit var playbackButtons: List<Button>