Raspberry Pi: Video and audio recording and streaming guide

I’ve spent some time setting up audio and video streaming on my Raspberry Pi (mostly used as a baby monitor right now). While there are great resources out there, it took me a lot of effort to find them and put everything together. So here’s a handy list of instructions for anyone who wants to:

  • Record and stream audio
  • Record and stream both audio and video into one file
  • Get a general introduction to the topic

Limitations

There are a few things I need to put up front here:

I use a Raspberry Pi 2 running Raspbian, with the Pi NoIR camera module and a cheap USB microphone connected to it. The same instructions probably apply to all Pi models. If you use a different OS than Raspbian, you’re probably far more advanced than I am and know how to adapt this stuff yourself. If you use a normal webcam instead of the Pi camera modules, most of my resources will not be helpful either.

I only cover streaming on a local network and will not address streaming over the internet. Most importantly, I’m an absolute beginner regarding Pi and Linux things and cannot give any security-related advice (e.g. how to set up an encrypted internet stream). So if you need these instructions and do anything security-related with them (like adding internet access to potentially private video streams), you’re completely insane.

Resources for the Raspberry Pi video camera

These are easy to find and I’ll just put them here for reference:

I’ve mostly been using the excellent Adafruit tutorials, most importantly to set up SSH access. Additionally, there is a good guide here on how to set up a static IP address (also check out the other tutorials at circuitbasics). The official Raspberry Pi website also has good tutorials.

For the camera module, you should be fine with the official Raspberry Pi documentation (also covering how to connect and enable it, plus a very basic getting-started guide for the Python module). Basically, I use these commands for recording images and video:

Image without preview:
raspistill -o image.jpg -n
Video in 720p at 2 Mbit/s (hit Ctrl+C to stop recording):
raspivid -o video.h264 -t 0 -b 2000000 -w 1280 -h 720

To access the files on my Pi, I use SFTP with FileZilla.
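
If you only need to grab individual files from the command line, scp from another Linux/OS X machine does the same job (a sketch – adjust the user, IP and path to your setup):
scp pi@192.168.1.20:/home/pi/video.h264 .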

This should set you up for everything more advanced. Additionally, use the “top” command when connected via SSH to see all running processes (use “k” to kill a process and Ctrl+C to exit top) and append ” &” to a command to start it in the background (so you don’t have to open several SSH windows at once to run several processes).
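
A quick example of that workflow (raspivid here is just a stand-in for whatever you started):
raspivid -o video.h264 -t 0 -b 2000000 -w 1280 -h 720 &
top           # note the PID of raspivid, press "k" to kill it, Ctrl+C to exit top
kill <PID>    # or kill it directly from the shell, using the PID top showed you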

Streaming video

For streaming video, the main options are GStreamer, VLC, FFmpeg, netcat and MJPEG.

There is a very good blogpost here that describes streaming video and audio with GStreamer (install it with sudo apt-get install gstreamer1.0) and also covers the setup for a USB microphone. For me, the stream worked very well and without any noticeable latency (I’d say below 100 ms). The main downside is that GStreamer is also required on the receiving end – with OS X and Linux this works perfectly fine; for Windows this tutorial here may help (although I haven’t tried it). If you want to connect with multiple devices including smartphones, this will probably not be your preferred option, but if you have one fixed machine as the receiver, this should work fine.
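
To give you an idea, a video-only GStreamer stream can look roughly like this (a sketch only – the IP address, port and bitrate are placeholders, the pipeline in the linked blogpost differs e.g. by including audio, and the receiver needs the usual GStreamer plugin packages such as gstreamer1.0-libav installed):

On the Pi (replace 192.168.1.10 with the receiver’s IP):
raspivid -t 0 -w 1280 -h 720 -fps 25 -b 2000000 -o - | \
 gst-launch-1.0 fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! \
 udpsink host=192.168.1.10 port=5000

On the receiving machine:
gst-launch-1.0 udpsrc port=5000 \
 caps="application/x-rtp, media=video, encoding-name=H264, payload=96" ! \
 rtph264depay ! avdec_h264 ! autovideosink sync=false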

For the other options, there is a decent article here describing each approach. It ultimately recommends MJPEG. This appears to be the best solution for web-based access from smartphones and also seems to have low latency (I haven’t tried it), but it probably won’t work well combined with audio or with high-resolution streams.

I’ve tried GStreamer (see above), VLC and netcat. VLC produced mediocre stream quality with a lot of lag; netcat was very simple to set up but wouldn’t work on Android, since you have to add “:demux=h264” as an option on the receiving end. If you want to try it, use these commands:

On the Pi:
raspivid -t 0 -o - | nc -l -p 2222
(-p: choose any port higher than 1023)

In VLC, open a network stream at:
tcp://192.168.1.20:2222
and add in the options: :demux=h264
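
Depending on your VLC version, you should also be able to pass the demuxer directly in the URL when starting VLC from the command line instead of setting the option (I haven’t tested this myself):
vlc tcp/h264://192.168.1.20:2222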

I haven’t tried FFmpeg since I didn’t feel like compiling it myself (about two hours) and there appears to be no clear benefit. Mainly I wanted to have both video and audio in my stream – which brings us first to network protocols and then to picam.

Network protocols and streaming audio

Again, I’m not an expert in any of this, and that’s particularly true for network protocols, so read ahead with care.

There are basically two protocols for sending data over a network: TCP and UDP – see here, here or here for an explanation. TCP checks that no packets are lost (slow and reliable), while UDP just pushes packets over the network without any checks (fast but unreliable). So for video streaming UDP is usually the better option (potentially less lag), but TCP is what HTTP runs on, appears to be easier to set up and is supported by more devices (e.g. every standard web browser). The simple streaming example with netcat above uses TCP.

Streams with RTP (Real-time Transport Protocol) usually use UDP. Since RTP only carries the stream itself, it is often used together with RTSP (Real Time Streaming Protocol) – we’ll get to that below.

Then there is the question of where to send your stream. You can either send it directly to a specific IP (which means you need to tell your Pi where the stream is headed), send it to a multicast address in the range 224.x.x.x – 239.x.x.x, which makes it available on the whole network (and, depending on your router configuration, can be forwarded to other connected LANs), or set up an RTSP server that processes incoming requests for streams. To give you an idea of the volumes involved: raspicam has a default bitrate of 5 Mbit/s, and a standard wifi card with 802.11g allows up to 54 Mbit/s. The recommendation I have seen is that for up to ~3 users and 3–4 seconds of lag, you should be fine with a TCP/HTTP stream; for more receiving devices or less lag, you should use UDP/RTP.
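
A rough back-of-the-envelope example based on these numbers: with a unicast TCP/HTTP stream every viewer gets their own copy, so three viewers at 5 Mbit/s already need about 15 Mbit/s of upload from the Pi – nominally fine for 54 Mbit/s 802.11g, but real-world wifi throughput is usually a good deal lower, so dropping the bitrate (e.g. to the 2 Mbit/s used in the raspivid example above) leaves more headroom. With multicast, the stream is sent only once no matter how many devices are listening.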

Below, I’ll show the different possibilities with VLC and an audio stream from the microphone. This can also be adapted for a video stream (see the links above). Setting up the USB microphone is well described in the article already mentioned above.

Audio stream examples with VLC:

TCP/HTTP, open http://<pi_ip_address>:8000 in VLC to receive:
cvlc -vvv alsa://hw:1,0 --sout \
 '#transcode{acodec=mp3,ab=128}:standard{access=http,mux=ts,dst=:8000}'
RTP with multicast, open rtp://239.255.1.1 in VLC to receive:
cvlc -vvv alsa://hw:1,0 --sout \
 '#transcode{acodec=mp3,ab=128}:rtp{mux=ts,dst=239.255.1.1,sdp=sap}'
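
If you prefer the command line on the receiving machine, opening the streams with VLC directly should work as well (the @ tells VLC to listen for the multicast stream; I mostly used the GUI myself):
vlc http://<pi_ip_address>:8000
vlc rtp://@239.255.1.1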

Both streams have a delay of ~1s and I’m a bit clueless on how to further reduce it.

Also, when distributing audio, there are other possibilities, including a direct stream over SSH or using FFmpeg, and on the more advanced side OpenOB (possibly) and Icecast.
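
As a rough idea of the SSH variant (a sketch I haven’t used in practice – it assumes the microphone is on hw:1,0 and alsa-utils is installed on both ends), run this on the receiving Linux machine:
ssh pi@<pi_ip_address> arecord -D hw:1,0 -f S16_LE -r 44100 -c 1 | aplay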

Combine video and audio with picam

Now that we have set up both video and audio recording – how can we combine them in one file? It turns out this is not as easy as it should be. The raspivid command does not accept audio input itself, so we need some other program to combine the two.

This should be possible with VLC – but even with a lot of searching, I’ve only found accounts of people trying and not really getting it to work. If you want to stream both to a different device, you can set up separate streams for video and audio – instructions are here for VLC and also here for FFmpeg and Icecast (and another FFmpeg post with few details here).

After spending a lot of time on optimizing VLC, I found picam, a program that does exactly what I was looking for. So if you want to either record or stream both video and audio on your Pi, I’d highly recommend picam. The instructions for installing picam are very clear and worked for me without problems.

Picam is started with (assuming the USB microphone is on hw:1,0):

cd ~/picam
./picam --alsadev hw:1,0 &

To start/stop recording use:

touch hooks/start_record
touch hooks/stop_record

Here are a few examples for setting up picam for different recording types:

Record 640x480 at 90 fps:
./picam --alsadev hw:1,0 -h 480 -w 640 -f 90 &
Record full HD (1080p at 30 fps):
./picam --alsadev hw:1,0 -w 1920 -h 1080 -f 30 &

Setting up a network stream with picam and node-rtsp-rtmp-server:

Picam has built-in support for HTTP streaming, and the instructions page also gives details on passing the stream to other programs. The easiest option is to use the node-rtsp-rtmp-server written by the same author, which works very well. While Node.js is already included in Raspbian, I needed to install the npm and coffee tools manually:

sudo apt-get install npm
sudo npm install coffee-script -g

With this, I was able to install the node-rtsp-rtmp-server as documented:

git clone https://github.com/iizukanao/node-rtsp-rtmp-server.git
cd node-rtsp-rtmp-server
npm install -d

To set things up, first start the server and then run picam with the --rtspout option:

cd ~/node-rtsp-rtmp-server
./start_server.sh &
# and once the server is started (you'll see some output in the console)
cd ~/picam
./picam --alsadev hw:1,0 --rtspout -w 800 -h 480 -v 500000 -f 20 &

To access the stream in VLC, open a network stream at rtsp://<pi_ip_address>:80/live/picam. This also works on my wife’s Galaxy S5 phone in VLC, but not on my (very old and slow) Android 4.0 tablet.

The RTSP server appears to take about 30 seconds to start, so bootup time is not optimal, but for now I think this is sufficient. Picam also appears to work well with other options like nginx – try that on your own if required.

Automatically starting a stream upon boot

There appear to be several ways to automatically start a program after a reboot, documented here. I use cron to start a script called autostart.sh, which itself starts either audiostream.sh or videostream.sh (I haven’t figured out how to start both at the same time yet, so I toggle the # comment depending on which one I currently don’t need).

Run “crontab -e” and add:
@reboot /home/pi/autostart.sh

These are the scripts I use (remember to make them executable with “chmod +x <scriptname>.sh”):

autostart.sh:
#!/bin/bash
#/home/pi/audiostream.sh > log_autostart.txt
/home/pi/videostream.sh > log_autostart.txt

audiostream.sh:
#!/bin/bash
sleep 5
cvlc -vvv alsa://hw:1,0 --sout \
'#transcode{acodec=mp3,ab=128}:standard{access=http,mux=ts,dst=:8000}' &

videostream.sh:
#!/bin/bash
sleep 1
cd /home/pi/node-rtsp-rtmp-server
./start_server.sh &
sleep 40
cd /home/pi/picam
./picam --alsadev hw:1,0 --rtspout -w 800 -h 480 -v 500000 -f 20 &

Sidenotes: it seems helpful to add a short delay after boot to make sure everything is set up. Also, the ~ shortcut for the home directory does not seem to be available yet at that point, so I use /home/pi instead. In any case, the “> log_autostart.txt” part in the first script writes the output to that text file, which gives an indication of what possibly went wrong.
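
One more caveat about my own script: the plain > only captures standard output. If you also want error messages in the log (and a fixed location for the file), something like this in autostart.sh should do it:
/home/pi/videostream.sh > /home/pi/log_autostart.txt 2>&1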

Optimizing latency and recording streams in VLC

When I first ran the code given above, I still had ~2 seconds of latency in my stream. Since the RTSP solution used above is supposed to be much faster, I had a look at VLC on the receiving end and found a simple improvement: under Tools > Preferences > Input / Codecs there’s a network caching option where you can change the setting from the default to lowest latency. With this, I’m down to roughly 300–500 ms of latency in my stream.
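
If you open the stream from the command line instead of the GUI, the same caching setting can be passed as an option (the value is in milliseconds – 200 is just a starting point to experiment with):
vlc --network-caching=200 rtsp://<pi_ip_address>:80/live/picam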

Also, VLC offers the possibility to record your current stream. You can either find it in the menu or add it to the playback buttons. In my case, the video is recorded to my local videos folder.

Motion detection and other resources

That’s it – we’re at the end of this now very long blog post. Since I’ve also had a look at motion detection (with an eye toward using the Pi as a simple home alarm system), and this relates to the video streaming topic, I’ll put those links here as well.

Regarding motion detection, I’ve seen two options: Python and motion.

Motion appears to be quite good at this and offers all kinds of features, with details for example here and here. Originally, the downside was that while motion works well with USB webcams, it did not work with the Pi Cam / Pi NoIR modules unless you installed a special version. That information is from ~2013, and it appears there is now a ready-made image called motionPie that shortcuts a lot of this, as well as an easier workaround using the uv4l software kit.
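
For reference, the basic motion setup looks roughly like this (a sketch – the relevant option names in motion.conf differ between motion versions, so check the documentation linked above):
sudo apt-get install motion
sudo nano /etc/motion/motion.conf    # set resolution, framerate and the output/stream options
sudo service motion start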

If you want to do this in Python, there are other options. Here is a detailed explanation using the RPi-Cam-Web-Interface (the first time I’ve heard of it) and Python-based push messages to a smartphone. A simpler approach that I’ve actually tried (successfully) is the simple Python script here (read on to the second page) that takes photos and compares them pixel by pixel.

Sidenote: OpenCV also appears to be an option for more advanced users.

Closing remarks: I’ve put a lot of effort into gathering all the resources I’ve found. If something is incorrect or missing, please let me know.

Update on Nov. 2nd 2016: I wanted to look up how to open the picam rtsp stream in VLC and found it missing in my blogpost. Now it’s added.

Update on Nov. 12th 2016: Added the install command for gstreamer1.0

40 thoughts on “Raspberry Pi: Video and audio recording and streaming guide”

  1. Great job! Just trying to get picam going. However, I’m getting an issue which I think may be due to the fact that I’m using a USB webcam. Do you know if picam is strictly for the Raspberry Pi camera?

    Thanks

    1. Hi Jon,
      as said in the beginning, these resources apply mostly to the Pi camera modules – in the case of picam, it appears to be module-only. For your USB webcam, UV4L might be a good solution worth investigating – see http://www.linux-projects.org/uv4l/ – I think they also have tutorials on their site.
      Otherwise, try googling the topic. I’ve seen solutions with motion (see e.g. http://www.instructables.com/id/Raspberry-Pi-remote-webcam/ – not sure if sound is supported) and with ffmpeg (you may have to compile it yourself, but I believe there are precompiled binaries available these days).
      Hope this helps.
      Cheers, Daniel

  2. Great write-up! I’m trying to do something similar (livestream audio+video). Did you end up using Gstreamer/ffmpeg/uv4l in your final solution? Or just picam?

  3. I’ve been using motion for a while with webcams, very happy with it. I see a config file setting for using either a USB cam or the Pi cam. Is it safe to assume the latest version of motion supports the Pi cam and the “enhanced” version is no longer required?

  4. you are awesome, you save my time searching the web, you gathered the optimum solution for video and audio streaming- in my case i will try to use picam too. thnx so much

  5. hi, wonder if anyone can help. getting the following error:

    [1] 858
    pi@PIZEROCAMERA1:~/node-rtsp-rtmp-server $ 2017-02-21 20:33:35.196 attachRecordedDir: dir=file app=file
    /home/pi/node-rtsp-rtmp-server/server.coffee:46
    throw err;
    ^

    Error: bind EADDRINUSE 0.0.0.0:7044
    at Object.exports._errnoException (util.js:907:11)
    at exports._exceptionWithHostPort (util.js:930:20)
    at dgram.js:214:18
    at nextTickCallbackWith3Args (node.js:453:9)
    at process._tickCallback (node.js:359:17)
    at Function.Module.runMain (module.js:443:11)
    at startup (node.js:139:18)
    at node.js:974:3

  6. Hi, I see that you have separate script for audio stream, however I don’t understand why you might need it since picam command does audio/video streaming already. Can you please explain its need.

    audiostream.sh:
    sleep 5
    cvlc -vvv alsa://hw:1,0 --sout \
    '#transcode{acodec=mp3,ab=128}:standard{access=http,mux=ts,dst=:8000}' &

    1. Hi Imran,
      I have a few usecases where I only want to stream audio without the video. In this case, the audiostream.sh-script is used.
      Cheers, Daniel

  7. Ah, okay.
    Your blog saved me countless hours 🙂 loads of thanks for this detailed write-up.
    I have PiCam with node-rtsp-rtmp-server running on an RPi Zero W, and it works quite fine, though due to H/W constraints sometimes it just crashes 😦 intermittently

    I need some more advice from you. I need to stream audio/video to a browser not using vlc player/plugin, so far I see that Picam does support HLS, however node-rtsp-rtmp-server does not support HLS publishing. Is my understanding correct?

    I think of 2 options:
    1) Using RTSP player for HTML such as this https://www.npmjs.com/package/html5_rtsp_player

    2) Using nginx server setup as explained in PiCam readme

    How would you go about publishing live video to browser based clients? Any advice/direction.

    Regards,
    Imran

    1. Hey Imran,
      I can’t really help you here – I haven’t tried anything beyond the article and have no experience with HTTP servers and streaming. Depending on the effort you want to put into this, using YouTube for the stream distribution might be a convenient option. I’m not sure whether you can use PiCam to connect to YouTube (assuming you want to include audio), so you’ll have to figure that out yourself.
      See these articles as a starting point:

      https://blog.alexellis.io/live-stream-with-docker/
      https://www.digikey.com/en/maker/blogs/969a7932d47d42a79ba72c81da4d9b66
      Just to be clear, I haven’t tried any of those. Most importantly, I can not give any advice regarding privacy and security implications of streams (e.g. who else will be able to watch them).
      Cheers, Daniel

  8. Imran, an absolutely superb write up and piece of software integration. I have one question. Is it possible to use a port other than 80 for the rtsp stream? If so in which configuration file would I need to make a change. Many thanks in advance.

    Best regards

    Al

    1. Hi Alan,
      if you’re using picam, have a look at the config.coffee file, the line
      serverPort: 80
      should do the trick. If you want to go beyond that, I cannot help.
      Cheers, Daniel

      1. Thanks for the input but changing the port from 80 in the config.coffee file just killed the stream altogether. It’s a great shame because this does everything I need with the exception of the rtsp port. I need to get it to a 55nn range. But thanks again for the input.

  9. Just a note. I was using the image and to change the port needed to change the config.js file. Worked like a dream. Information provided by Nao.

  10. hey there, thx for this priceless tutorial, it works!!
    i would like to stream rtsp audio and video over the internet (4g stick connected to Pi). i tried out port forwarding and changed the port in the config file but the server is not reacting to a connection (no viewer). is there a option to get this work over internet?
    greetings from germany

    1. Hey lyr1,
      I haven’t tried any of this, but a general direction should be:
      – make sure you use an http / TCP-compatible stream
      – try this first in your own network, once it works you need to configure firewalls and maybe set up suitable IPs (not sure how any of this works – a different option might be to use youtube for stream distribution. I’ve seen some tutorials on that online…)
      – also check bandwidth and make sure that your stream uses less than what your mobile connection upload allows…
      Good luck!
      Also Greetings from Germany, Daniel

  11. Hi Daniel

    I’m trying to run picam and record a .ts MPEG-TS file as a script that automatically runs on booting up the PI. However, from what has been observed, the
    ./picam --alsadev hw:1,0 -w 1920 -h 1080 -f 30 &
    starts the microphone and the camera, but the execution stalls printing
    “Configuring devices”
    “Started Capturing”
    Upon running the ./script.sh again
    It starts recording and writing to a .ts file.

    The .script.sh file is as under

    ./picam --alsadev hw:1,0 -w 1920 -h 1080 -f 30 &
    /usr/bin/touch hooks/start_record
    sleep 300
    /usr/bin/touvh hoks/stop_record

    1. Hi Sakshi,
      see these two remarks from my blog post:
      “Sidenotes: It appears to be helpful to add a short delay after boot to make sure everything is set up. Also, the home directory seems to be not yet available so I use /home/pi instead of ~. ”
      Does any of this help? I don’t really have the time to try this on my own right now…
      Greetings, Daniel

      1. Thanks for your revert Daniel. Much appreciated 🙂
        Apologies for the delay on my end. I could finally make it work as a bash script, by adding a she-bang at the beginning of the script. 🙂

  12. Hi
    Thanks for the amazingly useful post!!
    I’m trying to extract frames from the video that is being captured by picam (or may be capture frames in a seperate thread altogether) to perform some image processing with OpenCV. Could you please help me with the details as to how can it be achieved

    Shall be highly grateful 🙂

  13. Hi Daniel

    Thanks for your revert. So, as suggested the video (.ts) will be recorded first and then it can be converted to images. Correct? However I need to do both these tasks simultaneously (something like, the video is being recorded in one thread, while the second thread grabs frame and does some image processing on them)
    Is it possible to stream video through tcpout and then convert that stream to image frames?

    1. Hi Sakshi,
      let me be very clear that this is beyond my knowledge and I’m only taking a wild guess here. But probably this can be done both with ffmpeg. You might either
      – install ffmpeg and try it for yourself (a quick google search indicated that you still have to compile it, but there are good explanations) or
      – first ask for directions on some ffmpeg-related help site to make sure it can actually do this.
      (both are complementary steps, but you can decide on your own on which one to begin with)
      Good luck!
      Daniel

  14. Hi Daniel
    Hope you are doing well. Thanks to Mr. Nao’s picam library and your useful post, I’m able to run my application with great ease.

    The only concern I now face is that the audio recording stops after some time (say hours), giving the error “microphone buffer overrun: Trying to recover from the error”. I guess this has something to do with the microphone I’m using or (may be the alsamixer settings, not sure)
    Although this happens once in a while, but even then I have to reboot whenever this occurs to make sure both audio and video are being captured properly (upon the audio error, the microphone stops capturing)

    Moreover, I’m using the mini USB microphone from Adafruit.

    Have you come across this issue anytime? Any idea how can it be resolved?
    Also, I believe the USB microphone remains ON all the time (since it’s USB powered), so if I can find some way to disable it (and enable it only when needed), it would help combat the issue. As yet, I have no idea if disabling the microphone is even possible.

  15. Good Work!

    Unfortunately my sort of a try didn’t work.

    I installed:
    picam-1.4.6-binary-stretch.tar.xz from https://github.com/iizukanao/picam/releases
    and had to update the node server to v9.11.1
    I added the coffee script with sudo npm install coffee-script -g

    When I try to start the server i get

    pi@raspberrypi:~/node-rtsp-rtmp-server $ ./start_server.sh &
    [1] 918

    first and after a few seconds it says

    pi@raspberrypi:~/node-rtsp-rtmp-server $ /home/pi/node-rtsp-rtmp-server/mp4.coffee:915:16: error: unexpected newline
    obj = super

    If I try to start picam then it says (of course, because the server quitted):

    pi@raspberrypi:~/picam $ error: failed to connect to video data socket (/tmp/node_rtsp_rtmp_videoData): No such file or directory
    perhaps RTSP server (https://github.com/iizukanao/node-rtsp-rtmp-server) is not running?

    Do you have an idea of solving this problem?

    Cheers,
    Thomas

    1. I downloaded the image file, it ran ok. You can tweak the configuration, ports on the config.js file (I think).

      Hope this helps

      Al

  16. Hallo Daniel,

    Thanks a lot for this Topic , i want to know how can i use this steps with the Jetson Nano ?

  17. This page is the most amazing collection of helpful advice directed to exactly the use cases I was needing. You read my mind and wrote it up. Thank you!!!!
