Problem

I have a React Native Expo application from which I successfully call my Node.js API using my local IP. That API works both in the emulator and on my physical Android device. However, when I try to call my Flask API, I get a "Network Request Failed" error.
I am running my Flask app on my local machine (http://192.168.x.x:5000), and my physical Android device is connected to the same WiFi network.
Flask API (Python)

Here's my Flask app, which is a simple speech transcription API. It receives an audio file as base64, decodes it, and transcribes it using speech_recognition.
from flask import Flask, request, jsonify
import base64
import tempfile
import speech_recognition as sr
from pydub import AudioSegment
from io import BytesIO
from flask_cors import CORS
import logging
app = Flask(__name__)
CORS(app, resources={
    r"/*": {
        "origins": "*",
        "methods": ["GET", "POST", "OPTIONS"],
        "allow_headers": ["Content-Type", "Authorization"]
    }
})
logging.basicConfig(level=logging.DEBUG)
def transcribe_audio(audio_base64):
    # Decode the base64 audio data
    try:
        audio_data = base64.b64decode(audio_base64)
    except Exception as e:
        return f"Error decoding base64 audio: {str(e)}"
    # Convert the decoded audio to a format that SpeechRecognition can use (WAV)
    audio_file = BytesIO(audio_data)
    audio = AudioSegment.from_file(audio_file)
    # Create a temporary file to store the WAV audio
    with tempfile.NamedTemporaryFile(delete=False, suffix=".wav") as temp_file:
        audio.export(temp_file.name, format="wav")
        temp_file_path = temp_file.name
    # Use SpeechRecognition to transcribe the audio
    recognizer = sr.Recognizer()
    try:
        with sr.AudioFile(temp_file_path) as source:
            audio_listened = recognizer.record(source)
            transcription = recognizer.recognize_google(audio_listened)
            return transcription
    except Exception as e:
        return f"Error transcribing audio: {str(e)}"
@app.route('/health', methods=['GET'])
def health():
    logging.info("Health check endpoint called")
    return "Hello world from python"

@app.route('/transcribe', methods=['POST'])
def transcribe():
    # Get the base64 audio data from the POST request
    print("here")
    data = request.json
    print(data)
    if not data or 'audio_base64' not in data:
        return jsonify({"error": "Missing audio_base64 in request"}), 400
    audio_base64 = data['audio_base64']
    # Transcribe the audio
    transcription = transcribe_audio(audio_base64)
    if "Error" in transcription:
        return jsonify({"error": transcription}), 500
    return jsonify({"transcription": transcription}), 200

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000, debug=True, threaded=True)
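For reference, the /transcribe route can be exercised from the development machine with a short script like the following (a minimal sketch, not part of the app; the file name test.wav and the loopback URL are placeholder assumptions):

import base64
import requests  # assumed to be installed; any HTTP client works

# Read a local recording and encode it the same way the app does.
with open("test.wav", "rb") as f:  # hypothetical test file
    audio_base64 = base64.b64encode(f.read()).decode("utf-8")

# Post the payload in the same JSON shape the Flask route expects.
resp = requests.post(
    "http://127.0.0.1:5000/transcribe",
    json={"audio_base64": audio_base64},
    timeout=30,
)
print(resp.status_code, resp.json())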
React Native Expo App

This React Native function calls the Flask API. I record audio using expo-av, convert it to base64, and send it to Flask for transcription.
import { Audio } from "expo-av";
import { MutableRefObject } from "react";
import * as Filesystem from "expo-file-system";
import { Platform } from "react-native";
import * as Device from "expo-device";
import axios from "axios";
export const transcribeSpeechAssembly = async (
  audioRecordingRef: MutableRefObject<Audio.Recording>
) => {
  const isPrepared = audioRecordingRef?.current?._canRecord;
  if (!isPrepared) {
    console.error("Recording must be prepared first");
    return undefined;
  }
  try {
    await audioRecordingRef?.current?.stopAndUnloadAsync();
    const recordingUri = audioRecordingRef?.current?.getURI() || "";
    const baseUri = await Filesystem.readAsStringAsync(recordingUri, {
      encoding: Filesystem.EncodingType.Base64
    });
    const rootOrigin =
      Platform.OS === "android"
        ? "My local IP"
        : Device.isDevice
        ? process.env.LOCAL_DEV_IP || "localhost"
        : "localhost";
    const serverUrl = `http://${rootOrigin}:5000`;
    if (recordingUri && baseUri) {
      console.log("url", `${serverUrl}/transcribe`);
      const api = axios.create({
        baseURL: serverUrl,
        timeout: 10000,
        headers: {
          'Content-Type': 'application/json',
          'Accept': 'application/json'
        }
      });
      try {
        const healthCheck = await api.get('/health');
        console.log("Health check response:", healthCheck.data);
        const transcriptionResponse = await api.post('/transcribe', {
          audio_base64: baseUri
        });
        console.log("Transcription response:", transcriptionResponse.data);
        return transcriptionResponse.data?.transcription;
      } catch (error) {
        console.error("error from python server", error);
      }
    } else {
      console.error("Something went wrong with recording");
      return undefined;
    }
  } catch (error) {
    console.error("Error in transcription process:", error);
    return undefined;
  }
};
What I Have Tried

- Confirmed the Flask API is running: I checked http://127.0.0.1:5000/health and http://192.168.x.x:5000/health in Postman and in my browser; both return "Hello world from python".
- Checked Expo network requests: my Node.js API works fine at http://192.168.x.x:3000, but when I call Flask (http://192.168.x.x:5000/transcribe) I get "Network Request Failed".
- Allowed Flask to accept external connections: app.run(host='0.0.0.0', port=5000, debug=True, threaded=True) makes Flask reachable from other devices on the network.
- Checked for CORS issues: used flask_cors to allow all origins.
- Verified Android permissions: AndroidManifest.xml includes <uses-permission android:name="android.permission.INTERNET" />.
- Tried adb reverse tcp:5000 tcp:5000, which doesn't help since it's a physical device.
- Disabled the firewall / antivirus: no improvement.
- Checked the API calls in the Chrome debugger: fetch calls fail with "Network Request Failed".
1 Answer
Starting with Android 9 (API level 28), cleartext (plain HTTP) traffic is disabled by default. You either need to reach your server over HTTPS, or explicitly allow your app to make insecure requests by adding the following attribute to your Android app manifest: android:usesCleartextTraffic="true".
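For a bare React Native or prebuilt Expo project, that attribute goes on the <application> element of android/app/src/main/AndroidManifest.xml, roughly like this (a sketch showing only the relevant attribute, not a complete manifest):

<!-- AndroidManifest.xml: allow plain-HTTP (cleartext) requests during development -->
<application
    android:usesCleartextTraffic="true">
    <!-- keep your existing activities, services, and other attributes here -->
</application>

In an Expo managed project the manifest is generated, so the equivalent setting is normally applied through the app config (for example via the expo-build-properties plugin) and requires rebuilding the app. Serving the Flask API over HTTPS avoids the need for cleartext traffic altogether.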