Output of data received from an Android device, such as sound

The goal: I've got a small Android app running which accepts input via the mic and streams it back in real time. I'm trying to modify it so that it streams the data over to a Java server on my PC.

The problem: All I get is a "tappy" noise over and over again, or a continuous "beep" (which of the two I get depends on how I'm reading the data server-side). No actual sound comes out of the PC. I realize it's most likely something to do with the way I'm playing back the sound, since the Android device is able to play its own sound back accurately.

The code: On both the client and the server side there is only one class.

How the server is receiving and trying to play the data:

public static void main(String[] args) {
    try {
        ServerSocket serverSocket = new ServerSocket(6790);
        while (true) {

            Socket clientSocket = serverSocket.accept();

            System.out.println("Connected " + clientSocket.getInetAddress());
            InputStream in = clientSocket.getInputStream();
            // ObjectInputStream objIn = new ObjectInputStream(in);

            AudioFormat format = new AudioFormat(8000, 16, 1, true, true);

            AudioInputStream audIn = new AudioInputStream(in, format, AudioSystem.NOT_SPECIFIED);
            SourceDataLine speakers;
            try {
                byte[] data = new byte[32];
                int numBytesRead;
                DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
                speakers = (SourceDataLine) AudioSystem.getLine(dataLineInfo);
                speakers.open(format);
                speakers.start();
                // read from the stream and feed the speakers until end of stream
                while ((numBytesRead = audIn.read(data, 0, data.length)) > 0) {
                    speakers.write(data, 0, numBytesRead);
                    System.out.println("writing data");
                }
                speakers.drain();
                speakers.close();
            } catch (LineUnavailableException e) {
                e.printStackTrace();
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}

How I’m recording and sending from the client side:

public void run() {
    try {
        Log.i("Audio", "Running Audio Thread");
        AudioRecord recorder = null;
        AudioTrack track = null;
        // "socket" is the connected Socket held by the enclosing class
        ObjectOutputStream out = new ObjectOutputStream(socket.getOutputStream());
        short[][] buffers = new short[256][160];
        int ix = 0;

        /*
         * Initialize buffer to hold continuously recorded audio data, start recording, and start
         * playback.
         */
        int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        recorder = new AudioRecord(AudioSource.MIC, 8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N * 1);
        track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, N * 1,
                AudioTrack.MODE_STREAM);
        recorder.startRecording();
        track.play();

        /*
         * Loops until something outside of this thread stops it.
         * Reads the data from the recorder and writes it to the audio track for playback.
         */
        while (!stopped) {
            Log.i("Map", "Writing new data to buffer");
            short[] buffer = buffers[ix++ % buffers.length];
            N = recorder.read(buffer, 0, buffer.length);

            // ensure endianness, then send the recorded chunk to the server
            ByteBuffer bb = ByteBuffer.allocate(buffer.length * 2);
            bb.order(ByteOrder.BIG_ENDIAN);
            bb.asShortBuffer().put(buffer);
            byte[] byteBuffer = bb.array();
            out.writeObject(byteBuffer);
            out.flush();

            track.write(buffer, 0, buffer.length);
        }

        /*
         * Frees the thread's resources after the loop completes so that it can be run again.
         */
        recorder.stop();
        recorder.release();
        track.stop();
        track.release();
        Log.d("Exit", "Exiting thread");
    } catch (Throwable t) {
        Log.w("Audio", "Error reading voice audio", t);
    }
}

I know full well that UDP is usually what's used for voice streaming, but this is the only set of code that has come even close to working. In previous versions I tried RTP and WebRTC, as well as multiple libraries; this is the only version where I've been able to get "live" playback.

I've already tried a ton of different approaches, such as using Clip and ObjectInputStream, but nothing seems to work. In this version I am sending via ObjectOutputStream over the socket's OutputStream and reading via AudioInputStream, but I'm open to sending and reading any kind of data as long as it works. I've tried converting the short[] to a byte[] and playing that back, but it didn't seem to help.


No actual errors are reported (no exceptions were thrown), and there is nothing wrong in the LogCat.

(This is no longer true; see edit for more details)

Note that the only working configurations for format are:

AudioFormat format = new AudioFormat(8000, 16, 1, true, true);  //louder
AudioFormat format = new AudioFormat(8000, 16, 1, true, false); //quieter

The other two combinations of signed/unsigned and endianness throw exceptions.
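To see why the signed/endianness flags matter, here is a small self-contained sketch (the class and method names are illustrative only) showing how the same 16-bit sample is laid out under each byte order; reading samples with the wrong order scrambles every value, which is one way to end up with noise like this:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianDemo {
    // Encode one 16-bit PCM sample using the given byte order.
    static byte[] encode(short sample, ByteOrder order) {
        return ByteBuffer.allocate(2).order(order).putShort(sample).array();
    }

    public static void main(String[] args) {
        short sample = 0x1234;
        byte[] big = encode(sample, ByteOrder.BIG_ENDIAN);       // {0x12, 0x34}
        byte[] little = encode(sample, ByteOrder.LITTLE_ENDIAN); // {0x34, 0x12}
        System.out.println(big[0] + " " + little[0]);
    }
}
```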

I can also confirm that the data is actually being sent through the connection: I used Wireshark to tap into it, and I can clearly see the source, the destination, and the data being sent. I'm definitely sending to the correct port.

If any more code is needed, ask and I will edit and put it up.

TL;DR: How can I play back sound sent from an Android device to a computer running a Java server?

EDIT: Upon further investigation and some modification of the code, the sound is still not playing, but I have identified some new problems. LogCat is now showing an exception, and the server is constantly reading the end-of-stream signal (-1); I confirmed this by printing it to the screen.

LogCat Error:

03-04 10:28:47.496: W/Audio(794): Error reading voice audio
03-04 10:28:47.496: W/Audio(794): java.net.SocketException: Socket closed
03-04 10:28:47.496: W/Audio(794):   at libcore.io.Posix.sendtoBytes(Native Method)
03-04 10:28:47.496: W/Audio(794):   at libcore.io.Posix.sendto(Posix.java:156)
03-04 10:28:47.496: W/Audio(794):   at libcore.io.BlockGuardOs.sendto(BlockGuardOs.java:177)
03-04 10:28:47.496: W/Audio(794):   at libcore.io.IoBridge.sendto(IoBridge.java:466)
03-04 10:28:47.496: W/Audio(794):   at java.net.PlainSocketImpl.write(PlainSocketImpl.java:508)
03-04 10:28:47.496: W/Audio(794):   at java.net.PlainSocketImpl.access$100(PlainSocketImpl.java:46)
03-04 10:28:47.496: W/Audio(794):   at java.net.PlainSocketImpl$PlainSocketOutputStream.write(PlainSocketImpl.java:270)
03-04 10:28:47.496: W/Audio(794):   at java.io.OutputStream.write(OutputStream.java:82)
03-04 10:28:47.496: W/Audio(794):   at com.javaorigin.rtpsender.MainActivity$Audio.run(MainActivity.java:126)

As far as I can tell, I'm not closing any streams related to the socket, or the socket itself (on either the server or the client side), so what could be causing this exception? It is only thrown when I "stop" the recording; during recording, playback from the Android device works fine. If I remove the out.write() call, I'm back to full loop functionality (I can speak into the device and the DEVICE will play it back), and I can "start" and "stop" multiple times in the same session with no exceptions thrown. With out.write() back in the code, I am unable to start/stop multiple sessions: after the initial session the exception is thrown, and the next time I try to "start" the recording, the app crashes.

At a glance, although I have not examined your code closely or tried it myself, I can see you are writing your array with ObjectOutputStream.writeObject(), but reading it as raw bytes.

This will most certainly cause problems. ObjectOutputStream.writeObject() is for serializing Java objects; it writes more than just the raw data, even for an array. It writes the object's class, signature, and a few other things so that it can be deserialized by an ObjectInputStream on the other end (the format is documented in the Java Object Serialization Specification, if you are curious).

You have many options. The ones I can think of are:

  • Don't use ObjectOutputStream.writeObject(). Instead, write the raw data directly to the socket (with the correct endianness and such to match your frame format). You could, for example, buffer your data in a ByteBuffer via ByteBuffer#asShortBuffer() (making sure to set the endianness to match), then write the corresponding byte array to the socket's output stream. This is the approach with the lowest overhead.

  • Create some Serializable class that holds your short buffer as a member. Write it with an ObjectOutputStream and read it with an ObjectInputStream on the other end (you'll need an identical class on both client and server). Unlike the previous option, it won't be directly readable with an AudioInputStream, though.
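A minimal sketch of the second option; the class name AudioChunk and the round-trip helper are my inventions, and both sides would need an identical copy of the class:

```java
import java.io.*;
import java.util.Arrays;

// Hypothetical wrapper class holding one recorded buffer.
public class AudioChunk implements Serializable {
    private static final long serialVersionUID = 1L;
    public final short[] samples;

    public AudioChunk(short[] samples) {
        this.samples = samples;
    }

    // Serialize then deserialize, as the socket round trip would.
    static AudioChunk roundTrip(AudioChunk chunk) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(sink);
        oos.writeObject(chunk);
        oos.flush();
        ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(sink.toByteArray()));
        return (AudioChunk) in.readObject();
    }

    public static void main(String[] args) throws Exception {
        AudioChunk original = new AudioChunk(new short[] {1, 2, 3});
        AudioChunk copy = roundTrip(original);
        System.out.println(Arrays.equals(original.samples, copy.samples)); // prints "true"
    }
}
```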

You can probably think of other options as well. If ByteBuffer is too hard to work into your code, you can write each short individually as a pair of bytes; same end effect. (Note that ShortBuffer.array() will let you access the buffer as a short[], which may make integration easier.)
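A minimal sketch of the first option's conversion step, assuming big-endian samples to match the server's AudioFormat(8000, 16, 1, true, true); the class and method names here are illustrative only:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class RawPcmWriter {
    // Convert a short[] buffer to big-endian bytes so the server's
    // AudioInputStream can consume them as raw 16-bit signed PCM.
    static byte[] toBytes(short[] samples) {
        ByteBuffer bb = ByteBuffer.allocate(samples.length * 2).order(ByteOrder.BIG_ENDIAN);
        bb.asShortBuffer().put(samples);
        return bb.array();
    }

    // Write one recorded chunk straight to the socket's output stream.
    static void send(OutputStream out, short[] samples) throws IOException {
        out.write(toBytes(samples));
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        send(sink, new short[] {0x1234, (short) 0xFF00});
        System.out.println(sink.toByteArray().length); // 4 bytes for 2 samples
    }
}
```

On the server side, nothing else changes: the bytes arriving on the socket are already in the exact frame format the AudioInputStream expects.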
