<div>[[File:PineCube.jpg|400px|thumb|right|The PineCube]]<br />
<br />
The '''PineCube''' is a small, low-powered, open source IP camera. Whether you’re a parent looking for a FOSS baby camera, a privacy-oriented shopkeeper or homeowner looking for a security camera, or perhaps a tinkerer needing a camera for your drone – the CUBE can be the device for you. It features a 5MP Omnivision sensor, IR LEDs for night vision, Power over Ethernet, and a microphone.<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI NOR flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't all be used at the same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen ( RB043H40T03A-IPS or DFC-XS4300240 V01 )<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
** IR LEDs for night vision<br />
** Passive infrared sensor<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model 903048 lithium polymer battery pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (manufacturer's file, not an open source format)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| 5.12<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| v2021.04<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
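The download-and-flash steps above can be sketched as a quick verify-and-flash helper. This is a sketch: <code>/dev/sdX</code> is a placeholder for your microSD card device, so double-check it before running dd.<br />

```shell
# Sketch: check a downloaded image against an expected MD5 before flashing.
# usage: verify_md5 <expected-md5> <file>
verify_md5() {
  echo "$1  $2" | md5sum -c -
}

# On your workstation, after downloading the image (replace /dev/sdX with
# the actual microSD card device -- dd will overwrite it!):
#   verify_md5 61e5a6d3ab0f74ce8367c97b7f8cbb7b Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz
#   xzcat Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz | sudo dd of=/dev/sdX bs=4M conv=fsync
```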
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], it enables the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled (as root) with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480, 15fps, YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you may want to stop it with <code>systemctl stop motion</code> before running tasks such as <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
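The motion.conf changes above can be scripted; here is a sketch that edits a scratch copy so it can be dry-run anywhere (on the PineCube, point it at /etc/motion/motion.conf as root instead — the key names match the stock Debian/Ubuntu motion package):<br />

```shell
# Sketch: set the recommended capture parameters in a motion config file.
# A scratch copy is created here as a stand-in for /etc/motion/motion.conf.
MOTION_CONF=./motion.conf.example
printf 'width 320\nheight 240\nframerate 30\n' > "$MOTION_CONF"
for setting in "width 640" "height 480" "framerate 15"; do
  key="${setting%% *}"
  value="${setting#* }"
  if grep -q "^${key}[[:space:]]" "$MOTION_CONF"; then
    # Replace the existing line for this key.
    sed -i "s/^${key}[[:space:]].*/${key} ${value}/" "$MOTION_CONF"
  else
    # Key absent: append it.
    echo "${key} ${value}" >> "$MOTION_CONF"
  fi
done
cat "$MOTION_CONF"
```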
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND, which holds the microcontroller in reset and keeps it off the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and PineBook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, connect into the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
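When running repeated tests, a small helper can pull just the summary bitrates out of iperf3's output. This is a sketch: it keys off the human-readable "sender"/"receiver" summary lines, so it will break if iperf3's output format changes.<br />

```shell
# Sketch: print only the summary bitrates from iperf3's output.
summarize_iperf3() {
  awk '/sender$|receiver$/ {
    for (i = 1; i <= NF; i++)
      if ($i == "Mbits/sec") { print $NF ": " $(i - 1) " Mbits/sec"; break }
  }'
}
# Usage: iperf3 -c pinecube -t 60 | summarize_iperf3
```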
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s on a 2.462GHz wireless channel. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
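For reference, a minimal index.html along the lines of the hls.js getting-started example might look like the following. This is a sketch: it assumes you've downloaded hls.js into the same directory, and that gstreamer's <code>hlssink</code> is writing playlist.m3u8 there too. It writes to the current directory, so run it from /dev/shm/hls/ on the PineCube.<br />

```shell
# Sketch: write a minimal HLS player page into the current directory.
# Assumes hls.js and playlist.m3u8 live alongside it.
cat > index.html <<'EOF'
<!DOCTYPE html>
<html>
<body>
<script src="hls.js"></script>
<video id="video" controls autoplay muted></video>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // e.g. Safari/iPhone, which plays HLS natively
    video.src = 'playlist.m3u8';
  }
</script>
</body>
</html>
EOF
```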
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another terminal, run a simple single-threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to /dev/shm/hls. This will run on port 80, rather than the python3 server which defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and terminate any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the stream. Total system RAM used is roughly 64MB, and the load average is around 0.4-0.5 when idle and around 0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
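Until someone confirms a working pipeline, here is a hypothetical, untested sketch along those lines. It combines the UYVY capture and x264enc chain from the HLS example with gst-rtsp-launch from the JPEG RTSP example, swapping the payloader to <code>rtph264pay name=pay0</code> as hinted. Please verify it on hardware before treating it as canonical.<br />

```shell
# Hypothetical, UNTESTED sketch: x264enc chain from the HLS example fed to
# gst-rtsp-launch with the rtph264pay payloader named pay0.
PIPELINE='v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
echo "$PIPELINE"
# On the PineCube:
#   media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
#   gst-rtsp-launch "$PIPELINE"
```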
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the examples above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment, and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the PineCube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect it from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed on a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store either still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with Motion, set the following in /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start":<br />
<br />
# palette 14 = UYVY<br />
v4l2_palette 14<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution work fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want a different mode or resolution, you'll need to set the camera to it with the media-ctl tool that comes with the v4l-utils package, and this must happen before the motion service starts. A simple way to ensure that it gets set before Motion starts every time, even across reboots, is to make a small modification to /lib/systemd/system/motion.service:<br />
<br />
[Service]<br />
Type=simple<br />
User=motion<br />
# Add the following ExecStartPre line, with the mode that the camera will use<br />
# (systemd does not allow trailing comments after a value on the same line)<br />
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'<br />
ExecStart=/usr/bin/motion<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
JPEG_1X8/1280x720@1/15<br />
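<br />
As a concrete sketch of keeping the two in sync, the snippet below switches a motion.conf to the 1280x720 at 15fps mode. It works on a temporary copy of the file purely for illustration; on the PineCube you would edit /etc/motion/motion.conf itself and run the media-ctl command shown in the comment (v4l2_palette is left alone here because both modes use UYVY):<br />
<br />
```shell
# Illustrative only: work on a temporary copy of motion.conf.
conf=/tmp/motion.conf.example
printf 'width 640\nheight 480\nframerate 30\n' > "$conf"

# On the PineCube, first set the sensor mode (needs v4l-utils):
#   media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'

# Then make motion.conf match that mode:
sed -i -e 's/^width .*/width 1280/' \
       -e 's/^height .*/height 720/' \
       -e 's/^framerate .*/framerate 15/' "$conf"
cat "$conf"
```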
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube has a wired ethernet connection to the main network, it can be used as a WiFi access point, possibly extending your existing network further outside. Here are the steps to do this, starting from an Armbian system. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to manual if a static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit /etc/default/hostapd, uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit the /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
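<br />
For reference, a minimal /etc/hostapd.conf for this bridged setup might look like the following; the SSID, passphrase and channel are placeholders, and your wireless interface name may differ from wlan0:<br />
<br />
```
interface=wlan0
bridge=br0
driver=nl80211
# Placeholder SSID and passphrase - change these
ssid=MyPineCubeAP
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=ChangeMe
wpa_pairwise=CCMP
rsn_pairwise=CCMP
hw_mode=g
channel=6
```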
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a USB-A male-to-male cable to plug it into your computer. Note that the micro-USB port can be used only for power, because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* Additional patch to the PineCube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
* Instructions for sunxi and the ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules so they load on startup<br />
* Configure the g_ether device to start with a stable MAC address in /etc/modprobe.d/g_ether.conf:<br />
 options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (/etc/network/interfaces):<br />
 auto usb0<br />
 iface usb0 inet static<br />
 address 192.168.10.2<br />
 netmask 255.255.255.0<br />
* Boot the PineCube while plugged into a computer<br />
* Configure the USB ethernet device on the computer to be in the same subnet as the PineCube<br />
<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at https://github.com/wlhe/uvc-gadget to see if it can bridge the UVC gadget output with the v4l stream from the OV5640 camera sensor<br />
<br />
== PineCube as a recorder for loud noises ==<br />
<br />
If you have a kernel with sound support (see the Sound Controls section), you can use the PineCube to make recordings when there is a noise above a certain threshold. The following script is a very simple example that uses alsa-utils and the sox command to do this. You can use the noise-stats.txt file and some noise testing to figure out a good threshold for your camera.<br />
<br />
#!/bin/bash<br />
<br />
# Directory where the sound recordings should go<br />
NOISE_FILE_DIR="/root/noises"<br />
<br />
# Threshold to use with the mean delta to decide to preserve the recording<br />
MEAN_DELTA_THRESHOLD="0.002"<br />
<br />
# Sample length (in seconds)<br />
SAMPLE_LENGTH="10"<br />
<br />
while :<br />
do<br />
stats=$(arecord -d "$SAMPLE_LENGTH" -f S16_LE > /tmp/sample.wav 2>/dev/null && sox -t .wav /tmp/sample.wav -n stat 2>&1 | grep 'Mean delta:' | cut -d: -f2 | sed 's/^[ ]*//')<br />
ts=$(date +%s)<br />
if (( $(echo "$stats > $MEAN_DELTA_THRESHOLD" | bc -l) )); then<br />
mv /tmp/sample.wav "$NOISE_FILE_DIR/noise-$ts.wav" # TODO convert to mp3<br />
fi<br />
rm -f /tmp/sample.wav<br />
echo "$ts $stats" >> noise-stats.txt<br />
done<br />
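<br />
The "Mean delta" statistic that sox reports is simply the average absolute difference between successive samples, so quiet recordings score near zero while loud or abrupt noises score higher. As an illustration of the statistic itself, it can be computed with awk on a few hand-picked sample values (this does not use sox):<br />
<br />
```shell
# Mean delta = average of |sample[i] - sample[i-1]| over the recording.
# Example values 0.0, 0.5, 0.25 give deltas 0.5 and 0.25, so the mean is 0.375.
printf '0.0\n0.5\n0.25\n' | awk '
  NR > 1 { d = $1 - prev; if (d < 0) d = -d; sum += d; n++ }
  { prev = $1 }
  END { printf "%.3f\n", sum / n }'
# prints 0.375
```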
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> element to <code>videotestsrc</code>, and the gstreamer pipeline will then produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. While in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Focus ===<br />
<br />
The focus of the lens can be adjusted manually by rotating it. Note that the lens may initially be tight.<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions, you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter, using the commands below. Note that these were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which cannot by looking at their flags. The inactive flag indicates that a control is currently disabled. Some controls are disabled when other controls are turned on. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' Perhaps this will be fixed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
<br />
== Sound Controls ==<br />
<br />
Note that sound is currently only available with special patches on top of a 5.13.13 or later kernel, with Armbian or NixOS. Once you have a kernel that supports sound, you can install alsa-utils to get the alsamixer tool. The following mixer settings have been found to work for both playback and recording. Note that you'll need to press F5 to see the capture controls and the space bar to toggle capture for a device. The speaker dangles on a wire from the device. The microphone is located about 1cm below the lens on the front-facing circuit board.<br />
<br />
[[File:Pinecube_sound_mixer.png|800px]]<br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
{| class="wikitable"<br />
! Project<br />
! Author<br />
! Project Source<br />
! PineCube Implementations<br />
|-<br />
! openWRT<br />
| JuanEst<br />
| [https://github.com/juanesf/packages/tree/53c107ec2734f4bfa73faab78242b1b3745ffd7d/multimedia/mjpg-streamer juanesf packages]<br />
| [https://forum.pine64.org/showthread.php?tid=13158&pid=98379#pid98379 Pine64 Forum thread]<br />
|}<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=11338PineCube2021-09-06T13:10:24Z<p>Newton688: /* PineCube as a recorder for loud noises */</p>
<hr />
<div>[[File:PineCube.jpg|400px|thumb|right|The PineCube]]<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the manufacturer; not open source)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], this enables the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The Motion daemon can be enabled (as root) with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that Motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> when doing things like apt update and apt upgrade, and then start it again with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, you can use wires to jump RESET (RST) and GND to isolate the SoC.<br />
<br />
After this you can either use the Arduino IDE and its serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. With a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires into the GPIO block at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux, you can access the console of the PineCube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results below reflect using the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28 and 50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results below reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92 and 102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library, which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid writing these files to the SD card in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a web server to serve the files.<br />
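<br />
To make the segmenting concrete: the playlist is a small text file whose non-comment lines name the segments a client should currently fetch. The playlist below is a hand-written example of the shape of hlssink's output, not captured from a real run:<br />
<br />
```shell
# Hand-written example of a playlist.m3u8 like the one hlssink keeps rewriting.
cat > /tmp/playlist.m3u8 <<'EOF'
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:1.0,
segment00042.ts
#EXTINF:1.0,
segment00043.ts
#EXTINF:1.0,
segment00044.ts
EOF

# The .ts files a client should fetch are simply the non-comment lines:
grep -v '^#' /tmp/playlist.m3u8
```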
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
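<br />
A minimal index.html along those lines, adapted from the hls.js getting-started example, might look like this (it assumes the hls.js file has been downloaded into the same directory):<br />
<br />
```
<!DOCTYPE html>
<html>
<body>
  <video id="video" controls muted></video>
  <script src="hls.js"></script>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {
      // hls.js polls the playlist and fetches segments itself
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Safari and iPhone browsers play HLS natively
      video.src = 'playlist.m3u8';
    }
  </script>
</body>
</html>
```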
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root of the default configuration to /dev/shm/hls. nginx will serve on port 80 rather than port 8000, which the python3 server defaults to.<br />
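A server block along these lines should work for the nginx approach (a sketch; the site file path follows Debian's packaging, so adjust for your distribution):<br />
<pre><br />
# /etc/nginx/sites-available/default<br />
server {<br />
    listen 80 default_server;<br />
    root /dev/shm/hls;     # serves index.html, hls.js, playlist.m3u8 and the .ts segments<br />
    index index.html;<br />
    location / {<br />
        add_header Cache-Control no-cache;   # the playlist changes constantly; don't let clients cache it stale<br />
    }<br />
}<br />
</pre><br />
After editing, reload nginx with <code>sudo systemctl reload nginx</code>.<br />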
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
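As a sketch of that <code>rtmpsink</code> approach (untested on this hardware; the RTMP URL is a placeholder for your own nginx-rtmp ingest point), a pipeline along these lines should push the camera to an RTMP server — note that <code>rtmpsink</code> expects FLV, hence the <code>flvmux</code> element:<br />
<pre><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 \<br />
  ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency \<br />
  ! flvmux streamable=true ! rtmpsink location='rtmp://example.com/live/pinecube'<br />
</pre><br />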
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake autoconf automake libtool v4l-utils alsa-utils iotop vlc x264 \<br />
gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x gstreamer1.0-plugins-base \<br />
gstreamer1.0-plugins-{good,bad,ugly} libpango1.0-0 libpango1.0-dev \<br />
liblivemedia-dev liblog4cpp5-dev libasound2-dev libssl-dev libvpx-dev \<br />
libx264-dev libx265-dev libjpeg-dev v4l2loopback-dkms linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
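For example, something like the following should find and stop it (the systemd service name is an assumption; check your image):<br />
<pre><br />
pgrep -a motion                     # list any running motion processes<br />
systemctl stop motion 2>/dev/null \<br />
  || pkill motion                   # stop the service, or kill the process directly<br />
</pre><br />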
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU to compress the stream into H264 (640x480@7fps) and around 1-2% of the CPU to serve the RTSP stream. Total system RAM used is roughly 64MB; the load average is about 0.4-0.5 when idle, and about 0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
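Until someone confirms a working pipeline, here is an untested sketch along those lines, combining the capture settings from the HLS example with gst-rtsp-launch from the JPEG RTSP example below:<br />
<pre><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]' && <br />
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'<br />
</pre><br />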
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
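For example (239.255.12.34 is an arbitrary administratively-scoped multicast address; untested on this hardware):<br />
<pre><br />
# Transmit to the multicast group instead of a single client<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 \<br />
  ! udpsink host=239.255.12.34 auto-multicast=true port=8000<br />
# Each receiver joins the same group<br />
gst-launch-1.0 udpsrc address=239.255.12.34 auto-multicast=true port=8000 \<br />
  ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink<br />
</pre><br />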
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers that precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020) nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response that browsers other than Firefox require). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software, such as Jitsi Meet. This has fairly minimal (<1s) lag when tested over a wired 1Gb ethernet connection and the frame rate is passable; keep in mind, though, that MJPEG is very wasteful of network bandwidth. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the above gstreamer, but in 1280x720 resolution. Also, you will be streaming to the desktop machine using UDP, with IP address represented by $desktop below at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera pick the "Dummy Video Device..." and it should be much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, ie. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package is available for a variety of Linux flavours: there is a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion and works well with video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package. That will need to be set before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /lib/systemd/system/motion.service<br />
<br />
[Service]<br />
Type=simple<br />
User=motion<br />
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <=- Add this line here with the mode that the camera will use<br />
ExecStart=/usr/bin/motion<br />
<br />
Note that you must modify /etc/motion/motion.conf so that the v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
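After editing the unit file, make systemd pick up the change and restart the service:<br />
<pre><br />
sudo systemctl daemon-reload    # re-read the modified motion.service<br />
sudo systemctl restart motion<br />
</pre><br />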
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube will have a wired ethernet connection to the main network, it is possible to use it as a WiFi access point, possibly extending your existing network further outside. Here are some steps you can take, using an Armbian system as a starting point. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to manual if static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit the /etc/default/hostapd uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit the /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
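A minimal /etc/hostapd.conf for this bridged setup might look like the following (a sketch; the SSID, passphrase and channel are placeholders you should change):<br />
<pre><br />
interface=wlan0<br />
bridge=br0               # attach the AP to the br0 bridge configured above<br />
ssid=PineCubeAP<br />
hw_mode=g<br />
channel=6<br />
wpa=2<br />
wpa_key_mgmt=WPA-PSK<br />
wpa_passphrase=ChangeThisPassphrase<br />
rsn_pairwise=CCMP<br />
</pre><br />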
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male-to-male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power because the data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* Additional patch to the PineCube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
* Instructions for sunxi and the ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
* Configure the g_ether device to start with a stable MAC address<br />
 /etc/modprobe.d/g_ether.conf:<br />
 options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (/etc/network/interfaces)<br />
 auto usb0<br />
 iface usb0 inet static<br />
 address 192.168.10.2<br />
 netmask 255.255.255.0<br />
* Boot the PineCube plugged into a computer<br />
* Configure the USB ethernet device on the computer to be in the same subnet as the PineCube<br />
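On the host side that could look like this (the interface name is an example; find yours with <code>ip link</code> after plugging in the PineCube):<br />
<pre><br />
sudo ip addr add 192.168.10.1/24 dev enp0s20f0u1   # same subnet as the PineCube's usb0<br />
sudo ip link set enp0s20f0u1 up<br />
ping 192.168.10.2    # the static address configured on the PineCube above<br />
</pre><br />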
<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at this project to see if it can bridge UVC gadget output with the v4l2 feed from the OV5640 camera sensor: https://github.com/wlhe/uvc-gadget<br />
<br />
== PineCube as a recorder for loud noises ==<br />
<br />
If you have a kernel with sound support (see the Sound Controls section) then you can use the PineCube to make recordings when there is a noise above a certain threshold. The following script is a very simple example using alsa-utils and the sox command. You can use the noise-stats.txt file and some noise testing to figure out a good threshold for your camera.<br />
<br />
#!/bin/bash<br />
<br />
# Directory where the sound recordings should go<br />
NOISE_FILE_DIR="/root/noises"<br />
<br />
# Threshold to use with the mean delta to decide to preserve the recording<br />
MEAN_DELTA_THRESHOLD="0.002"<br />
<br />
# Sample length (in seconds)<br />
SAMPLE_LENGTH="10"<br />
<br />
while :<br />
do<br />
stats=$(arecord -d "$SAMPLE_LENGTH" -f S16_LE > /tmp/sample.wav 2>/dev/null && sox -t .wav /tmp/sample.wav -n stat 2>&1 | grep 'Mean delta:' | cut -d: -f2 | sed 's/^[ ]*//')<br />
ts=$(date +%s)<br />
if (( $(echo "$stats > $MEAN_DELTA_THRESHOLD" | bc -l) )); then<br />
mv /tmp/sample.wav "$NOISE_FILE_DIR/noise-$ts.wav" # TODO convert to mp3<br />
fi<br />
rm -f /tmp/sample.wav<br />
echo "$ts $stats" >> noise-stats.txt<br />
done<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Focus ===<br />
<br />
The focus of the lens can be adjusted manually by rotating it. Note that the lens may initially be stiff.<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions, you can turn on the infrared LEDs to light up the dark area and also enable the IR-cut filter using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which cannot by looking at the flags. The inactive flag indicates that a control is currently disabled. Some controls become inactive when others are turned on; for example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure flag is inverted, so a value of "0" means on, while "1" means off.''' Maybe the auto_exposure flag will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
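For example, to set the exposure manually (the exposure value here is an arbitrary starting point; tune it to your lighting):<br />
<pre><br />
# Disable auto exposure (inverted on current kernels: 1 means off)<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl auto_exposure=1<br />
# The exposure control is now active and can be set (range 0-65535)<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl exposure=1000<br />
</pre><br />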
<br />
== Sound Controls ==<br />
<br />
Note that sound is currently only available with special patches on top of a 5.13.13 or newer kernel with Armbian or NixOS. Once you have a kernel that supports sound, you can install alsa-utils to get the alsamixer tool. The following mixer settings have been found to work with both playback and record. Note that you'll need to press F5 to see the capture controls and the space bar to toggle capture for a device. The speaker dangles on a wire from the device. The microphone is located about 1cm below the lens on the front-facing circuit board.<br />
<br />
[[File:Pinecube_sound_mixer.png|800px]]<br />
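Once the mixer is set up, a quick way to check that capture and playback both work is to record a short clip and play it back (the sample rate here is just an example):<br />
<pre><br />
arecord -d 5 -f S16_LE -r 16000 /tmp/test.wav   # record 5 seconds from the internal mic<br />
aplay /tmp/test.wav                             # play it back through the speaker<br />
</pre><br />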
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=11308PineCube2021-09-05T22:06:15Z<p>Newton688: </p>
<hr />
<div>[[File:PineCube.jpg|400px|thumb|right|The PineCube]]<br />
<br />
The '''PineCube''' is a small, low-powered, open source IP camera. Whether you’re a parent looking for a FOSS baby-camera, a privacy-oriented shopkeeper, a homeowner looking for a security camera, or perhaps a tinkerer needing a camera for your drone – the CUBE can be the device for you. It features a 5MPx Omnivision sensor and IR LEDs for night vision, as well as Power over Ethernet and a microphone.<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen ( RB043H40T03A-IPS or DFC-XS4300240 V01 )<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (factory file, not open source)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| TBD (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328 MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], it enables the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled (as root) with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480, 15 fps, YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, and restart it afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube is marked by a small white dot on the PCB; it should be directly next to the micro-USB power connector. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a jumper wire between RESET (RST) and GND to hold the microcontroller in reset.<br />
<br />
After this you can either use the Arduino IDE and its serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, connect to the GPIO header at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results below reflect the wireless network. The link speed was 72.2 Mb/s using 2.462 GHz wireless. Over sixty-second iperf3 tests, the observed throughput varied between 28 and 50 Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge; the wireless network uses WPA2 and the PineCube is connected to this wireless bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results below reflect the Ethernet network. The link speed was 100 Mb/s using a 1000 Mb/s prosumer switch. Over sixty-second iperf3 tests, the observed throughput varied between 92 and 102 Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the PineCube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
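For illustration, the following sketch writes a minimal index.html of the kind described above via a shell heredoc. It assumes you have downloaded hls.js into the same directory under the name hls.min.js; the element id and file names are arbitrary choices, not part of hls.js itself.<br />

```shell
# Write a minimal HLS playback page into the tmpfs directory.
# Assumes hls.js has been saved next to it as hls.min.js.
mkdir -p /dev/shm/hls
cat > /dev/shm/hls/index.html <<'EOF'
<!DOCTYPE html>
<html>
<body>
  <video id="video" controls autoplay muted></video>
  <script src="hls.min.js"></script>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {
      // Most browsers: use hls.js to fetch playlist.m3u8 and the .ts chunks.
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Safari and iOS play HLS natively.
      video.src = 'playlist.m3u8';
    }
  </script>
</body>
</html>
EOF
```
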
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
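A sketch of the nginx setup just described, assuming the Debian/Ubuntu sites-available layout; the site file name <code>hls</code> is an arbitrary choice.<br />

```shell
# Point nginx's default server at the tmpfs HLS directory.
mkdir -p /etc/nginx/sites-available /etc/nginx/sites-enabled
cat > /etc/nginx/sites-available/hls <<'EOF'
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;
    location / {
        # The playlist changes constantly; intermediaries must not cache it.
        add_header Cache-Control no-cache;
    }
}
EOF
# Replace the default site with this one.
ln -sf /etc/nginx/sites-available/hls /etc/nginx/sites-enabled/default
# Then reload: systemctl reload nginx
```
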
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
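As an illustration of the <code>rtmpsink</code> route mentioned above, here is an untested sketch that retargets the capture pipeline at an nginx-rtmp server, wrapped in a small script; the RTMP URL is a placeholder for your own server.<br />

```shell
# Write a helper script that sends the encoded stream to an RTMP server
# instead of writing HLS files locally. rtmpsink expects FLV, hence flvmux.
cat > /tmp/pinecube-rtmp.sh <<'EOF'
#!/bin/bash
# Placeholder URL: substitute your own nginx-rtmp application/stream key.
RTMP_URL="rtmp://example.com/live/pinecube"
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]'
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! flvmux streamable=true ! rtmpsink location="$RTMP_URL"
EOF
chmod +x /tmp/pinecube-rtmp.sh
```
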
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses roughly 45-50% of the CPU for compressing the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the stream. Total system RAM used is roughly 64MB, and the load average is about 0.4-0.5 when idle and 0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, likely because the H264 compression is done in software without hardware acceleration.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
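Following the hint, here is an untested sketch combining the software h264 encoding pipeline from the HLS example with the gst-rtsp-launch tool from the JPEG RTSP example; the resolution and framerate are arbitrary choices.<br />

```shell
# Write a helper script: capture UYVY frames, encode with x264 (slow on
# this CPU), and serve over RTSP via the rtph264pay payloader.
cat > /tmp/h264-rtsp.sh <<'EOF'
#!/bin/bash
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
EOF
chmod +x /tmp/h264-rtsp.sh
```

As with the JPEG example, the result should be playable with <code>vlc rtsp://pinecube.local:8554/video</code>; please update the wiki with working parameters if you try it.<br />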
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the above gstreamer, but in 1280x720 resolution. Also, you will be streaming to the desktop machine using UDP, with IP address represented by $desktop below at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera pick the "Dummy Video Device..." and it should be much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, ie. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package is available in a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion and works well with video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package. That will need to be set before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /lib/systemd/system/motion.service<br />
<br />
[Service]<br />
Type=simple<br />
User=motion<br />
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <=- Add this line here with the mode that the camera will use<br />
ExecStart=/usr/bin/motion<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
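As an alternative to editing the packaged unit file under /lib directly, the same ExecStartPre line can be placed in a systemd drop-in, which survives package upgrades. A sketch (run as root), using the 1280x720 mode discussed above; the drop-in file name is arbitrary:<br />

```shell
# Create a systemd drop-in that sets the camera mode before motion starts.
mkdir -p /etc/systemd/system/motion.service.d
cat > /etc/systemd/system/motion.service.d/camera-mode.conf <<'EOF'
[Service]
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'
EOF
# Then: systemctl daemon-reload && systemctl restart motion
```
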
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube will have a wired ethernet connection to the main network it is possible to use it as a WiFi access point, possibly extending your existing network to further outside. Here are some steps to do this, starting from an Armbian system. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to manual if static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit /etc/default/hostapd, uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit the /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
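The steps above reference /etc/hostapd.conf without showing one. Here is a minimal sketch for bridged AP mode; the SSID, channel, and passphrase are placeholders to change, and the bridge name matches the br0 interface configured above.<br />

```shell
# Minimal hostapd configuration for a WPA2 AP bridged onto br0.
cat > /etc/hostapd.conf <<'EOF'
interface=wlan0
bridge=br0
driver=nl80211
ssid=pinecube-ap
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=changeme-use-a-strong-passphrase
rsn_pairwise=CCMP
EOF
```
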
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male-to-male) cable to plug it into your computer. Note that the micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
-Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
-Additional patch to the PineCube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
-Instructions for sunxi and ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
-Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
-Configure the g_ether device to start with a stable MAC address<br />
/etc/modprobe.d/g_ether.conf:<br />
options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
-Set a static IP address for usb0 on startup with network manager (/etc/network/interfaces)<br />
auto usb0<br />
iface usb0 inet static<br />
address 192.168.10.2<br />
netmask 255.255.255.0<br />
-Boot the pinecube plugging it into a computer<br />
-Configure the USB ethernet device on the computer to be in the same subnet as the pinecube<br />
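On the host computer side, the USB Ethernet gadget needs a matching address in the same subnet. A sketch for a Debian-style host using /etc/network/interfaces (the interface name is an assumption; with predictable interface naming the gadget may show up under a name derived from its MAC address instead of usb0):<br />
<br />
```
# Host side: static address in the PineCube's 192.168.10.0/24 subnet
auto usb0
iface usb0 inet static
    address 192.168.10.1
    netmask 255.255.255.0
```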
<br />
-Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam<br />
-Look at this project to see if it can bridge UVC gadget output with the v4l2 stream from the OV5640 camera sensor<br />
https://github.com/wlhe/uvc-gadget<br />
<br />
== PineCube as a recorder for loud noises ==<br />
<br />
If you have a kernel with sound support (see the Sound Controls section), you can use the PineCube to make recordings when there is a noise above a certain threshold. The following script is a very simple example that uses alsa-utils and the sox command to do this. You can use the noise-stats.txt file and some noise testing to figure out a good threshold for your camera.<br />
<br />
#!/bin/bash<br />
<br />
# Directory where the sound recordings should go<br />
NOISE_FILE_DIR="/root/noises"<br />
<br />
# Threshold to use with the mean delta to decide to preserve the recording<br />
MEAN_DELTA_THRESHOLD="0.002"<br />
<br />
# Sample length (in seconds)<br />
SAMPLE_LENGTH="10"<br />
<br />
while :<br />
do<br />
stats=$(arecord -d "$SAMPLE_LENGTH" -f S16_LE > /tmp/sample.wav 2>/dev/null && sox -t .wav /tmp/sample.wav -n stat 2>&1 | grep 'Mean delta:' | cut -d: -f2 | sed 's/^[ ]*//')<br />
ts=$(date +%s)<br />
if (( $(echo "$stats > $MEAN_DELTA_THRESHOLD" | bc -l) )); then<br />
    mv /tmp/sample.wav "$NOISE_FILE_DIR/noise-$ts.wav" # TODO convert to mp3<br />
fi<br />
rm -f /tmp/sample.wav<br />
echo "$ts $stats" >> noise-stats.txt<br />
done<br />
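The script keeps a sample only when its sox "Mean delta" exceeds the threshold. Since POSIX shell arithmetic is integer-only, the floating point comparison is delegated to bc; the comparison can be exercised standalone like this (the values shown are arbitrary examples):<br />
<br />
```shell
#!/bin/sh
# Compare a sox "Mean delta" reading against a threshold using bc,
# because shell arithmetic cannot handle floating point values.
above_threshold() {
    # $1 = measured mean delta, $2 = threshold
    [ "$(echo "$1 > $2" | bc -l)" -eq 1 ]
}

if above_threshold "0.0035" "0.002"; then
    echo "noise detected: keep the recording"
fi
```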
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Focus ===<br />
<br />
The focus of the lens can be adjusted manually by rotating it. Note that the lens may be tight initially.<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions, you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
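The steps above can be wrapped in a small helper function; this is a hypothetical sketch (not from the original page), with the sysfs root parameterized so it can be tested without hardware. Remember that the LED values may be inverted depending on your kernel, as noted above:<br />
<br />
```shell
#!/bin/sh
# Hypothetical night_mode helper for the PineCube. SYSFS defaults to /sys
# but can be overridden, so the logic can be exercised off-device.
SYSFS="${SYSFS:-/sys}"

night_mode() {  # usage: night_mode on|off  (run as root on the device)
    case "$1" in
        on)  led=0; ircut=1 ;;   # 0 = LEDs on, 1 = IR cut enabled (per this page)
        off) led=1; ircut=0 ;;
        *)   echo "usage: night_mode on|off" >&2; return 1 ;;
    esac
    echo "$led" > "$SYSFS/class/leds/pine64:ir:led1/brightness"
    echo "$led" > "$SYSFS/class/leds/pine64:ir:led2/brightness"
    # Export GPIO 45 for the IR cut filter if not already exported
    [ -d "$SYSFS/class/gpio/gpio45" ] || echo 45 > "$SYSFS/class/gpio/export"
    echo out > "$SYSFS/class/gpio/gpio45/direction"
    echo "$ircut" > "$SYSFS/class/gpio/gpio45/value"
}
```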
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can currently be changed by looking at the flags. The inactive flag indicates that a control is currently disabled. Some controls become inactive when others are turned on. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1." Note that at the current time '''the auto_exposure flag is inverted, so a value of "0" means on, while "1" means off.''' This may be fixed in a future kernel. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
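As a quick way to list which controls are currently inactive, the flags column can be filtered with awk; a small sketch (the function name is ours, not part of v4l-utils):<br />
<br />
```shell
#!/bin/sh
# Filter a v4l2-ctl --list-ctrls listing down to the names of controls
# whose flags include "inactive" (i.e. not currently settable).
inactive_ctrls() {
    awk '/flags=.*inactive/ { print $1 }'
}
```
<br />
On the device, use it as <code>v4l2-ctl -d /dev/v4l-subdev0 --list-ctrls | inactive_ctrls</code>.<br />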
<br />
== Sound Controls ==<br />
<br />
Note that sound is currently only available with special patches on top of a 5.13.13 or higher kernel with Armbian or NixOS. Once you have a kernel that supports sound, you can install alsa-utils to get the alsamixer tool. The following mixer settings have been found to work with both playback and recording. Note that you'll need to press F5 to show the capture controls and the space bar to toggle capture for a device. The speaker dangles on a wire from the device. The microphone is located about 1cm below the lens on the front-facing circuit board.<br />
<br />
[[File:Pinecube_sound_mixer.png|800px]]<br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=11289PineCube2021-09-04T23:59:04Z<p>Newton688: /* Sound Controls */</p>
<hr />
<div>[[File:PineCube.jpg|400px|thumb|right|The PineCube]]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the manufacturer, not open source)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like apt update and apt upgrade, and start it again afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND to hold the microcontroller in reset.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, connect into the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results below reflect using the wireless network. The link speed was 72.2Mb/s on the 2.462GHz band. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
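A minimal index.html along the lines of the hls.js getting-started example might look like the following (this is a sketch, not the exact file from the hls.js README; it assumes you saved hls.js next to index.html in /dev/shm/hls):<br />
<br />
```html
<!DOCTYPE html>
<html>
<head><title>PineCube HLS</title></head>
<body>
  <video id="video" controls autoplay muted width="640"></video>
  <script src="hls.js"></script>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {
      // hls.js playback for browsers without native HLS support
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      video.src = 'playlist.m3u8'; // native HLS (Safari/iOS)
    }
  </script>
</body>
</html>
```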
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another terminal, run a simple single-threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
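An nginx server block for this could look roughly like the following (a sketch to be adapted to your distribution's default site configuration; the Cache-Control header keeps clients refetching the playlist):<br />
<br />
```
# /etc/nginx/sites-available/default - serve the HLS files from tmpfs
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;
    location / {
        # The playlist must not be cached by clients
        add_header Cache-Control no-cache;
    }
}
```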
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compressing the stream to H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB, and the load average is around 0.4-0.5 when idle and around 0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small 3rd-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
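As an untested starting point (please verify and correct), combining the camera setup from the HLS example with gst-rtsp-launch from the JPEG RTSP example might look like the following; the camera mode, resolution and x264enc tuning options are assumptions:<br />

```shell
# UNTESTED sketch: h264 RTSP, combining pieces of the HLS and JPEG RTSP examples.
# Expect significant CPU load, since there is no hardware-accelerated encoding.
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480 ! videoconvert ! \
    x264enc tune=zerolatency speed-preset=ultrafast bitrate=500 ! rtph264pay name=pay0'
```

If this works for you, the stream should be playable the same way as the JPEG RTSP stream, e.g. <code>vlc rtsp://pinecube.local:8554/video</code>.<br />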
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
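For example, a multicast variant might look like this (untested; 239.255.42.42 is an arbitrary group from the administratively-scoped multicast range, not something the original examples specify):<br />

```shell
# UNTESTED multicast sketch; 239.255.42.42 is an arbitrary example group address.
# Sender (on the PineCube):
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! \
    udpsink host=239.255.42.42 auto-multicast=true port=8000
# Each receiver:
gst-launch-1.0 udpsrc address=239.255.42.42 auto-multicast=true port=8000 ! \
    'application/x-rtp, encoding-name=JPEG,payload=26' ! rtpjpegdepay ! jpegdec ! autovideosink
```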
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the above gstreamer examples, but at 1280x720 resolution. You will stream to the desktop machine over UDP port 8000; its IP address is represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
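One workaround that may help when the pipeline gets into this state is to reload the loopback module (a sketch, untested beyond the setup described above; make sure no client is still holding /dev/video10 first):<br />

```shell
# Sketch: reset the v4l2loopback device after a not-negotiated failure.
# Any process still holding /dev/video10 will prevent the module from unloading.
sudo fuser -k /dev/video10 || true   # optionally kick off stale clients
sudo rmmod v4l2loopback
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1
```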
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment and the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed in a variety of Linux flavours; there's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in /etc/motion/motion.conf and start it with <code>sudo /etc/init.d/motion start</code>:<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution work fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions, you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package, before the motion service starts. A simple way to ensure it gets set before motion starts every time, even across reboots, is a small modification to /lib/systemd/system/motion.service:<br />
<br />
[Service]<br />
Type=simple<br />
User=motion<br />
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <=- Add this line here with the mode that the camera will use<br />
ExecStart=/usr/bin/motion<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
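Since files under /lib/systemd/system can be overwritten by package upgrades, a systemd drop-in override is an alternative to editing the unit file directly (a sketch, assuming a systemd recent enough to provide <code>systemctl edit</code>):<br />

```
# Run: sudo systemctl edit motion
# Then add the following in the drop-in editor that opens:
[Service]
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'
```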
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube will have a wired ethernet connection to the main network, it is possible to use it as a WiFi access point, possibly extending your existing network further outside. Here are some steps to do this, starting from an Armbian system. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to manual if static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit /etc/default/hostapd, uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
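A minimal /etc/hostapd.conf for this bridged setup might look like the following (the SSID, passphrase and channel are placeholders to change, and the interface name wlan0 is an assumption):<br />

```
# Hypothetical minimal hostapd.conf; adjust interface, ssid, channel, passphrase.
interface=wlan0
bridge=br0
ssid=ExampleSSID
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=ChangeThisPassphrase
rsn_pairwise=CCMP
```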
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male-to-male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
-Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
-Additional patch to the pinecube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
-Instructions for sunxi and ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
-Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
-Configure the g_ether device to start with a stable MAC address<br />
/etc/modprobe.d/g_ether.conf:<br />
options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
-Set a static IP address for usb0 on startup with network manager (/etc/network/interfaces)<br />
auto usb0<br />
iface usb0 inet static<br />
address 192.168.10.2<br />
netmask 255.255.255.0<br />
-Boot the pinecube plugging it into a computer<br />
-Configure the USB ethernet device on the computer to be in the same subnet as the pinecube<br />
<br />
-Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam<br />
-Look at this project to see if it can bridge UVC gadget output with the v4l2 output from the OV5640 camera sensor<br />
https://github.com/wlhe/uvc-gadget<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Focus ===<br />
<br />
The focus of the lens can be manually adjusted by rotating it. Note that the lens may initially be tight.<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LED's to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from here [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which cannot by looking at their flags. The inactive flag indicates that a control is currently disabled; some controls become inactive when a related automatic control is turned on. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1." Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' Maybe this will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
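For example, to take manual control of exposure (an untested sketch; the exposure value is an arbitrary example, and remember the inverted auto_exposure semantics described above):<br />

```shell
# Sketch: disable auto exposure (value=1 means OFF on this driver), then set it manually.
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl auto_exposure=1
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl exposure=2000   # arbitrary example value
# Re-enable automatic exposure:
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl auto_exposure=0
```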
<br />
== Sound Controls ==<br />
<br />
Note that sound is currently only available with special patches on top of a 5.13.13 or later kernel with Armbian or NixOS. Once you have a kernel that supports sound, you can install alsa-utils to get the alsamixer tool. The following mixer settings have been found to work for both playback and recording. Note that you'll need to press F5 to see the capture controls and the space bar to toggle capture for a device. The speaker dangles on a wire from the device. The microphone is located about 1cm below the lens on the front-facing circuit board.<br />
<br />
[[File:Pinecube_sound_mixer.png|800px]]<br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either setup a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering]<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=File:Pinecube_sound_mixer.png&diff=11288File:Pinecube sound mixer.png2021-09-04T23:57:42Z<p>Newton688: </p>
<hr />
<div></div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=11287PineCube2021-09-04T23:57:18Z<p>Newton688: </p>
<hr />
<div>[[File:PineCube.jpg|400px|thumb|right|The PineCube]]<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube ( china not open source file) 3D file]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328 MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], these builds enable the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the usual Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled as root with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480, 15 fps, YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you will want to stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, then start it again with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube is marked with a small white dot on the PCB; it should be directly next to the micro-USB power connector. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND, which holds the microcontroller in reset and keeps it off the serial lines.<br />
<br />
After this you can use either the Arduino IDE and its Serial Monitor (after selecting your <code>/dev/ttyACMx</code> Arduino device), or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. With a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires into the GPIO block at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
These performance results reflect use of the wireless network. The link speed was 72.2 Mb/s on the 2.462 GHz channel. Running sixty-second iperf3 tests, the observed throughput varies between 28 and 50 Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is in turn connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
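<br />
As a sanity check on these numbers, iperf3's bitrate column follows directly from its transfer column: transfer is reported in binary MBytes while bitrate is decimal Mbits/sec. A small script (not part of the original test setup) reproduces the 41.0 Mbits/sec sender figure above:<br />

```python
# iperf3 reports transfer in binary MBytes (1 MByte = 2**20 bytes) and
# bitrate in decimal Mbits/sec (1 Mbit = 10**6 bits); the figures above
# can be reproduced from one another.

def iperf3_bitrate_mbits(transfer_mbytes: float, seconds: float) -> float:
    """Convert an iperf3 transfer total into its Mbits/sec bitrate."""
    bits = transfer_mbytes * 2**20 * 8
    return bits / seconds / 10**6

# 293 MBytes sent over 60 seconds, as in the wireless sender line above
print(round(iperf3_bitrate_mbits(293, 60.0), 1))  # -> 41.0
```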
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
These performance results reflect use of the Ethernet network. The link speed was 100 Mb/s using a 1000 Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92 and 102 Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
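<br />
To put the WireGuard numbers in perspective, the relative throughput cost can be computed from the sender bitrates above. This is a back-of-the-envelope calculation on the reported figures, not an additional benchmark:<br />

```python
# Relative throughput reduction when tunnelling over WireGuard,
# computed from the iperf3 sender bitrates (Mbits/sec) reported above.

def reduction_percent(plain: float, wireguard: float) -> float:
    return (plain - wireguard) / plain * 100

print(round(reduction_percent(94.4, 71.2), 1))  # wired, client to server: -> 24.6
print(round(reduction_percent(94.1, 89.8), 1))  # wired, reverse (-R): -> 4.6
```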
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware but haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
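<br />
For reference, a minimal index.html along these lines might look like the following. This is a sketch based on the hls.js README's getting-started example; it assumes you saved a local copy of hls.js next to it in /dev/shm/hls:<br />

```html
<!DOCTYPE html>
<html>
<body>
  <video id="video" controls muted></video>
  <!-- local copy of https://github.com/video-dev/hls.js/ -->
  <script src="hls.js"></script>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');  // written by the hlssink element
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      video.src = 'playlist.m3u8';  // native HLS support (e.g. iPhone)
    }
  </script>
</body>
</html>
```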
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root of the default configuration to /dev/shm/hls. nginx will serve on port 80, whereas the python3 server defaults to port 8000.<br />
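<br />
A minimal nginx server block for this purpose might look like the following (a sketch assuming the default Debian/Ubuntu nginx layout; adjust paths and ports to taste). The MIME types and the no-cache header matter for HLS, because clients refetch playlist.m3u8 continuously:<br />

```nginx
# Sketch for /etc/nginx/sites-available/default
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;

    # HLS clients poll the playlist constantly; never let it be cached.
    location ~ \.m3u8$ {
        add_header Cache-Control no-cache;
    }

    # A types block here replaces the inherited defaults, so list
    # everything this server needs to serve.
    types {
        application/vnd.apple.mpegurl m3u8;
        video/mp2t ts;
        text/html html;
        application/javascript js;
    }
}
```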
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the stream. Total system RAM used is roughly 64MB, and the load average is around 0.4-0.5 when idle and around 0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will stream to the desktop machine over UDP, to the IP address represented by $desktop below, on UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at this moment, and the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed on a variety of Linux flavours. There is a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. It is also possible to hook in scripts to automatically process or upload those recordings.<br />
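<br />
For example, motion's event hooks can run a command when a picture or movie is written; a sketch of a motion.conf fragment follows. The upload URL is a placeholder, and %f is motion's conversion specifier for the file it just wrote:<br />

```ini
# /etc/motion/motion.conf fragment (sketch; the URL is a placeholder)
# %f expands to the path of the file motion just finished writing.
on_picture_save /usr/bin/curl -s -T %f https://example.com/upload/
on_movie_end    /usr/bin/curl -s -T %f https://example.com/upload/
```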
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution work fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions, you'll need to set the camera to them with the media-ctl tool from the v4l-utils package, and that must happen before the motion service starts. A simple way to ensure it is set before motion starts every time, even across reboots, is a small modification to /lib/systemd/system/motion.service:<br />
<br />
[Service]<br />
Type=simple<br />
User=motion<br />
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <=- Add this line here with the mode that the camera will use<br />
ExecStart=/usr/bin/motion<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far:<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
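<br />
As a concrete pairing, the 1280x720 mode above would be matched by a motion.conf fragment like this (a sketch; palette 14 is UYVY, as noted in the quick-start settings earlier):<br />

```ini
# /etc/motion/motion.conf fragment matching UYVY8_2X8/1280x720@1/15 (sketch)
v4l2_palette 14   # UYVY
width 1280
height 720
framerate 15
```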
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube will have a wired ethernet connection to the main network, it is possible to use it as a WiFi access point, possibly extending your existing network further outside. Here are some steps to do this, using an Armbian system as a starting point. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to manual if static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit the /etc/default/hostapd uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit the /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
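<br />
A minimal /etc/hostapd.conf for this bridged setup might look like the following sketch. The SSID, passphrase and channel are placeholders to replace with your own:<br />

```ini
# /etc/hostapd.conf (sketch; replace ssid/wpa_passphrase/channel)
interface=wlan0
bridge=br0
driver=nl80211
ssid=PineCubeAP
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=ChangeThisPassphrase
rsn_pairwise=CCMP
```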
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male) cable to plug it into your computer. Note that the micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
-Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
-Additional patch to pinecube device tree disable ehci0 and ohci0, enabling usb_otg device instead and setting dr_mode to otg<br />
-Instructions for sunxi and ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
-Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
-Configure the g_ether device to start with a stable MAC address<br />
/etc/modprobe.d/g_ether.conf:<br />
options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
-Set a static IP address for usb0 on startup with network manager (/etc/network/interfaces)<br />
auto usb0<br />
iface usb0 inet static<br />
address 192.168.10.2<br />
netmask 255.255.255.0<br />
-Boot the pinecube plugging it into a computer<br />
-Configure the USB ethernet device on the computer to be in the same subnet as the pinecube<br />
<br />
-Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam<br />
-Look at this project to see if it can bridge UVC gadget output with the v4l2 stream from the OV5640 camera sensor:<br />
https://github.com/wlhe/uvc-gadget<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Focus ===<br />
<br />
The focus of the lens can be manually adjusted by rotating it. Note that the lens may initially be quite tight.<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can currently be changed by looking at their flags. The inactive flag indicates that a control is currently disabled; some controls become inactive when a related automatic control is enabled. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1." Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off''' (this may change in a future kernel). You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
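A quick way to see which controls are settable right now is to filter the listing for the inactive flag. A small sketch, using a saved copy of the listing (the three sample lines are taken from the output above; on the device you would pipe v4l2-ctl output in directly instead of using a file):<br />
<br />
```shell
# Filter a saved v4l2-ctl --list-ctrls listing down to the controls that are
# not flagged "inactive" (i.e. the ones you can set right now).
cat > listing.txt <<'EOF'
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile
horizontal_flip 0x00980914 (bool) : default=0 value=0
EOF
grep -v 'flags=.*inactive' listing.txt | awk '{print $1}'
```
<br />
This prints contrast and horizontal_flip, since gain carries the inactive flag while gain_automatic is enabled.<br />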
<br />
== Sound Controls ==<br />
<br />
Note that sound is currently only available with special patches on top of a 5.13.13 or later kernel, on Armbian or NixOS. Once you have a kernel that supports sound, install alsa-utils to get the alsamixer tool. The following mixer settings have been found to work for both playback and recording. Note that in alsamixer you need to press F5 to show the capture controls, and the space bar to toggle capture for a device. The speaker dangles on a wire from the device; the microphone is located about 1 cm below the lens on the front-facing circuit board.<br />
<br />
[[File:Pinecube_sound_mixer.png|400px]]<br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine as the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the manufacturer; not an open source format)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328 MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled as root with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480, 15 fps, YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
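The settings mentioned above correspond to a few lines in /etc/motion/motion.conf. A sketch of the relevant fragment (option names follow the motion 4.x configuration format; verify them against the comments in your own motion.conf):<br />
<br />
```text
# /etc/motion/motion.conf — relevant excerpt
width 640
height 480
framerate 15
# Also set v4l2_palette to the numeric code listed for YU12 in the comments
# of your motion.conf; the code differs between motion versions.
```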
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube is marked with a small white dot on the PCB; it should be directly next to the micro-USB power connector. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use a wire to jumper RESET (RST) to GND, which holds the microcontroller in reset so it does not interfere with the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or use screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, connect to the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results below reflect use of the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results below reflect use of the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
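To make the moving parts concrete, here is the rough shape of a playlist.m3u8 as hlssink rewrites it (the tag set is standard HLS; the segment file names and sequence numbers shown are illustrative, not captured from a real run):<br />
<br />
```text
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:41
#EXTINF:1.0,
segment00041.ts
#EXTINF:1.0,
segment00042.ts
#EXTINF:1.0,
segment00043.ts
```
<br />
Clients re-download this file continuously; as the media sequence number advances, old segments drop off the front of the playlist and new ones appear at the end.<br />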
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
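A minimal index.html along these lines might look as follows (a sketch based on the hls.js Getting Started example; it assumes you saved hls.js next to index.html, and the element id is arbitrary):<br />
<br />
```html
<!DOCTYPE html>
<html>
  <body>
    <video id="video" controls muted autoplay></video>
    <script src="hls.js"></script>
    <script>
      var video = document.getElementById('video');
      if (Hls.isSupported()) {
        // Most browsers: hls.js fetches the playlist and feeds segments via MSE
        var hls = new Hls();
        hls.loadSource('playlist.m3u8');
        hls.attachMedia(video);
      } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
        // Safari / iPhone can play HLS natively
        video.src = 'playlist.m3u8';
      }
    </script>
  </body>
</html>
```
<br />
Browsers with native HLS support (notably Safari) take the second branch; everywhere else hls.js handles the playlist polling and segment fetching.<br />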
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses roughly 45-50% of the CPU for compressing the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB, and the load average is about 0.4-0.5 when idle and 0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
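If the browser shows nothing, it can help to eyeball the framing first. The sketch below (plain bash, no camera or socat needed) emits the same response preamble as mjpeg-response.sh plus one dummy part; the part headers multipartmux actually writes may differ slightly, and the JPEG payload here is just a placeholder.<br />

```shell
#!/bin/bash
# Emit the HTTP preamble used by mjpeg-response.sh followed by one dummy
# multipart frame, so the expected byte layout can be inspected offline.
emit_mjpeg_preamble() {
    local b="--duct_tape_boundary"
    printf 'HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=%s\r\n\r\n' "$b"
    local frame="placeholder-jpeg-bytes"   # a real server writes JPEG data here
    printf -- '--%s\r\nContent-Type: image/jpeg\r\nContent-Length: %d\r\n\r\n%s\r\n' \
        "$b" "${#frame}" "$frame"
}
emit_mjpeg_preamble
```

Comparing this against <code>curl -s http://pinecube.local:8080/ | head -c 300</code> quickly shows whether the camera, the framing, or the browser is the failing part.<br />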
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. When tested over a wired 1Gb ethernet connection, the lag is fairly minimal (<1s) and the frame rate is passable; keep in mind, though, that MJPEG is very wasteful of network bandwidth. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will be streaming to the desktop machine using UDP, with the desktop's IP address represented by $desktop below, at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed in a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion trigger capabilities to store either still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
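For example, Motion's on_picture_save and on_movie_end options run an arbitrary command with %f substituted by the path of the file just written, which is enough to build such a hook. A minimal sketch (the staging directory is a placeholder; swap the mv for scp or rsync to push recordings off the device instead):<br />

```shell
#!/bin/sh
# upload-capture.sh -- hook for motion.conf, e.g.:
#   on_movie_end /usr/local/bin/upload-capture.sh %f
# Moves a finished recording into a staging directory for later upload.
stage_capture() {
    capture="$1"                                            # path Motion passes via %f
    staging="${MOTION_STAGING:-/var/spool/motion-uploads}"  # placeholder directory
    mkdir -p "$staging"
    mv "$capture" "$staging/"
}
# Run only when Motion actually passes a filename:
if [ "$#" -ge 1 ]; then stage_capture "$1"; fi
```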
<br />
To get things working quickly with motion, you can set the following in /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution work fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package. That has to happen before the motion service starts. A simple way to ensure it is set before motion starts every time, even across reboots, is to make a small modification to /lib/systemd/system/motion.service<br />
<br />
[Service]<br />
Type=simple<br />
User=motion<br />
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <=- Add this line here with the mode that the camera will use<br />
ExecStart=/usr/bin/motion<br />
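As an alternative to editing the packaged unit file (which a package upgrade can overwrite), the same line can go in a systemd drop-in override; a sketch, assuming the same camera mode as above:<br />

```ini
# sudo systemctl edit motion
# then add, in the editor that opens:
[Service]
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'
```

Then run <code>sudo systemctl daemon-reload && sudo systemctl restart motion</code>.<br />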
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
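Since motion.conf has to be kept in sync with the media-ctl mode by hand, a small helper can at least split a mode string like those above into the matching width/height/framerate values (the v4l2_palette still has to be looked up in the Motion documentation). A sketch:<br />

```shell
#!/bin/sh
# Split a media-ctl mode string such as UYVY8_2X8/1280x720@1/15 into the
# width/height/framerate values that /etc/motion/motion.conf must match.
mode_to_motion() {
    mode="$1"
    geom=${mode#*/}     # 1280x720@1/15
    size=${geom%@*}     # 1280x720
    rate=${geom#*@}     # 1/15
    echo "width ${size%x*}"
    echo "height ${size#*x}"
    echo "framerate ${rate#*/}"
}
mode_to_motion "UYVY8_2X8/1280x720@1/15"
```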
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube will have a wired ethernet connection to the main network, it is possible to use it as a WiFi access point, possibly extending your existing network further outside. Here are some steps to do this, using an Armbian system as a starting point. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to manual if static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit /etc/default/hostapd, uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit the /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
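For reference, a minimal /etc/hostapd.conf for a WPA2 access point bridged onto br0 might look like the sketch below; the interface name, SSID, channel and passphrase are placeholders to adapt (nl80211 is the usual driver choice for mainline kernels):<br />

```
interface=wlan0
driver=nl80211
bridge=br0
# Placeholders: pick your own SSID, channel and passphrase
ssid=PineCubeAP
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
wpa_passphrase=ChangeThisPassphrase
```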
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power because the data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* Additional patch to the pinecube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
* Instructions for sunxi and ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
* Configure the g_ether device to start with a stable MAC address in /etc/modprobe.d/g_ether.conf:<br />
 options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (/etc/network/interfaces):<br />
 auto usb0<br />
 iface usb0 inet static<br />
 address 192.168.10.2<br />
 netmask 255.255.255.0<br />
* Boot the pinecube while plugged into a computer<br />
* Configure the USB ethernet device on the computer to be in the same subnet as the pinecube<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at https://github.com/wlhe/uvc-gadget to see if it can bridge UVC gadget output with the v4l2 stream from the OV5640 camera sensor<br />
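The host_addr value above can be any stable, locally administered unicast MAC. Rather than hand-picking one, such an address can be derived from a fixed identifier like the hostname; a sketch:<br />

```shell
#!/bin/sh
# Derive a stable, locally administered unicast MAC address from a fixed
# string, suitable for g_ether's host_addr/dev_addr module options.
stable_mac() {
    h=$(printf '%s' "$1" | md5sum | cut -c1-12)    # take 12 hex digits
    first=$(printf '%s' "$h" | cut -c1-2)
    # Clear the multicast bit and set the locally-administered bit.
    first=$(printf '%02x' $(( (0x$first & 0xfe) | 0x02 )))
    rest=$(printf '%s' "$h" | cut -c3-12)
    out="$first"
    for i in 1 3 5 7 9; do
        out="$out:$(printf '%s' "$rest" | cut -c"$i"-"$((i + 1))")"
    done
    printf '%s\n' "$out"
}
stable_mac "pinecube"
```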
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Focus ===<br />
<br />
The focus of the lens can be manually adjusted through rotation. Note that the lens may initially be tight.<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
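The commands above can be wrapped into a small helper that switches the LEDs and the cut filter together. This is a sketch, not a tested utility: the sysfs paths are the ones used above, SYSFS is overridable purely so the logic can be exercised off-device, and as noted the LED polarity may be inverted on some kernels.<br />

```shell
#!/bin/bash
# Toggle night mode: IR LEDs plus IR cut filter in one step.
SYSFS="${SYSFS:-/sys}"   # overridable so the logic can be tested off-device

night_mode() {   # usage: night_mode on|off  (run as root on the PineCube)
    local led cut
    case "$1" in
        on)  led=0; cut=1 ;;   # echo 0 turns the LEDs on here; may be inverted
        off) led=1; cut=0 ;;
        *)   echo "usage: night_mode on|off" >&2; return 1 ;;
    esac
    echo "$led" > "$SYSFS/class/leds/pine64:ir:led1/brightness"
    echo "$led" > "$SYSFS/class/leds/pine64:ir:led2/brightness"
    if [ ! -d "$SYSFS/class/gpio/gpio45" ]; then   # export the pin once
        echo 45 > "$SYSFS/class/gpio/export"
        echo out > "$SYSFS/class/gpio/gpio45/direction"
    fi
    echo "$cut" > "$SYSFS/class/gpio/gpio45/value"
}

# Example (on the device): night_mode on
```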
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which cannot by looking at the flags. The inactive flag indicates that a control is currently disabled; some controls become inactive when a related automatic control is turned on. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' Maybe the auto_exposure control will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
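The inactive controls can also be picked out of the listing mechanically. A small sketch, run here on an embedded sample of the output above rather than the live camera (pipe in <code>v4l2-ctl -d /dev/v4l-subdev* --list-ctrls</code> on the device instead):<br />

```shell
#!/bin/sh
# Print the names of v4l2 controls that are currently inactive, i.e. cannot
# be set until the corresponding automatic control is turned off.
list_inactive() {
    awk '/flags=.*inactive/ { print $1 }'
}
# Embedded sample of the listing above, used here instead of the camera:
sample='contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile'
printf '%s\n' "$sample" | list_inactive
```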
<br />
== Sound Controls ==<br />
<br />
Note that sound is currently only available with special patches on top of a 5.13.13 or higher kernel with Armbian or NixOS. Once you have a kernel that supports sound, you can install alsa-utils to get the alsamixer tool. The following mixer settings have been found to work with both playback and record. Note that in alsamixer you'll need to press F5 to show the capture controls and the space bar to toggle capture for a device. The speaker dangles on a wire from the device. The microphone is located about 1cm below the lens on the front-facing circuit board.<br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=11285PineCube2021-09-04T23:55:38Z<p>Newton688: </p>
<hr />
<div>[[File:PineCube.jpg|400px|thumb|right|The PineCube]]<br />
<br />
The '''PineCube''' is a small, low-powered, open source IP camera. Whether you’re a parent looking for a FOSS baby-camera, a privacy oriented shop keeper, a home owner looking for a security camera, or perhaps a tinkerer needing a camera for your drone – the CUBE can be the device for you. It features a 5MPx Omnivision sensor and IR LEDs for night vision, as well as Power over Ethernet and a microphone.<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen ( RB043H40T03A-IPS or DFC-XS4300240 V01 )<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube ( china not open source file) 3D file]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], it enables the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
Motion daemon can be enabled using systemctl (with root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the pinecube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like apt update and apt upgrade, and start it again afterwards with <code>systemctl start motion</code><br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or if your Arduino board does not allow this, you can use wires to jump RESET (RST) and GND to isolate the microcontroller.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for pinephone and pinebook pro from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD 3.5mm female terminal block], run breadboard wires into the GPIO block at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the pinecube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
These performance results were obtained over the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty second iperf3 tests: the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
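As a sanity check on these numbers: iperf3 reports transfer in binary MBytes (MiB) but bitrate in decimal Mbits/sec, and the two columns are consistent. For the first run above:<br />

```shell
# 293 MiB in 60 s: bytes -> bits -> decimal megabits per second.
awk 'BEGIN { printf "%.1f Mbits/sec\n", 293 * 1048576 * 8 / 60 / 1e6 }'
# prints 41.0 Mbits/sec
```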
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
These performance results were obtained over the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
<pre><br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
</pre><br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
<pre><br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
</pre><br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
<pre><br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
</pre><br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
<pre><br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
</pre><br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding with the x264 library, which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have not yet tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including on Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. Its disadvantages are that it adds (at minimum) several seconds of latency, and that it requires an h264 encoder (which the PineCube has in hardware, but which we haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid writing these files to the SD card in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
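For example, a minimal index.html along those lines can be created directly in the tmpfs directory. The markup below is an illustrative adaptation of the hls.js README example and assumes a local copy of hls.js in the same directory:<br />

```shell
# Write a minimal HLS player page into the tmpfs directory.
mkdir -p /dev/shm/hls
cat > /dev/shm/hls/index.html <<'EOF'
<!DOCTYPE html>
<html>
<head><script src="hls.js"></script></head>
<body>
<video id="video" controls autoplay muted></video>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = 'playlist.m3u8';   // native HLS support (Safari/iOS)
  }
</script>
</body>
</html>
EOF
```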
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root of the default configuration to /dev/shm/hls. nginx will serve on port 80, whereas the python3 server defaults to port 8000.<br />
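A minimal nginx site configuration for this purpose might look like the following sketch; the paths and MIME type handling are illustrative and should be adapted to your distribution's nginx layout:<br />

```nginx
server {
    listen 80;
    root /dev/shm/hls;
    index index.html;

    location / {
        # Serve HLS playlists and segments with appropriate MIME types
        types {
            application/vnd.apple.mpegurl m3u8;
            video/mp2t ts;
            text/html html;
            application/javascript js;
        }
        # Playlists are rewritten constantly, so discourage caching
        add_header Cache-Control no-cache;
    }
}
```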
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake autoconf automake libtool v4l-utils alsa-utils vlc iotop \<br />
gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x gstreamer1.0-plugins-base \<br />
gstreamer1.0-plugins-{good,bad,ugly} libpango1.0-0 libpango1.0-dev \<br />
liblivemedia-dev liblog4cpp5-dev libasound2-dev libssl-dev x264 \<br />
v4l2loopback-dkms libvpx-dev libx264-dev libjpeg-dev libx265-dev \<br />
linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB, and the load average is around 0.4-0.5 when idle and around 0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about 2-3s of lag with this approach, likely due to the H264 compression and the current lack of hardware acceleration.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
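Until someone documents a tested pipeline, here is an untested sketch combining the camera setup from the HLS example with gst-rtsp-launch from the JPEG RTSP example; please verify it on real hardware before relying on it (and update this section):<br />

```shell
# Untested sketch: software h264 encoding via x264, so expect high CPU usage.
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]' &&
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
```

If it works, the stream should be playable with <code>vlc rtsp://pinecube.local:8554/video</code> as in the JPEG example.<br />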
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software such as Jitsi Meet. This has fairly minimal (<1s) lag when tested over a wired 1Gb Ethernet connection, and the frame rate is passable. Keep in mind that MJPEG is very wasteful of network resources. The following instructions assume Debian Linux (Bullseye) on your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work on Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, as was done [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, set up the PineCube with gstreamer much like the examples above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error when launching the gstreamer pipeline above is the following message, which seems to happen when [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers isn't set] on the v4l2loopback module (see above), or when a v4l client (vlc, chromium) is already connected to /dev/video10 while starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that /dev/video10 is hooked into the gstreamer pipeline, you can connect to it using VLC; this is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are currently [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems].<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment, and the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the PineCube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect it from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflections.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package is available for a variety of Linux flavours; there's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and has motion-trigger capabilities to store still pictures or, in later versions, videos. It is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with Motion, set the following in /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start":<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions, you'll need to set the camera to them with the media-ctl tool from the v4l-utils package before the motion service starts. A simple way to ensure that this happens every time, even across reboots, is a small modification to /lib/systemd/system/motion.service:<br />
<br />
[Service]<br />
Type=simple<br />
User=motion<br />
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <=- Add this line here with the mode that the camera will use<br />
ExecStart=/usr/bin/motion<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette value to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube has a wired Ethernet connection to the main network, it is possible to use it as a WiFi access point, for example to extend your existing network further outside. Here are some steps to do this, starting from an Armbian system. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to manual if static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit /etc/default/hostapd, uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit the /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
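A minimal /etc/hostapd.conf for the steps above might look like the following sketch; the interface name, SSID, channel, and passphrase are placeholders to adapt:<br />

```ini
interface=wlan0
bridge=br0
driver=nl80211
ssid=PineCubeAP
hw_mode=g
channel=6
wmm_enabled=1
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=ChangeMe123
rsn_pairwise=CCMP
```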
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male-to-male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power, because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* An additional patch to the PineCube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
* Instructions for sunxi and the Ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules so they load on startup<br />
* Configure the g_ether device to start with a stable MAC address in /etc/modprobe.d/g_ether.conf:<br />
 options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (/etc/network/interfaces):<br />
 auto usb0<br />
 iface usb0 inet static<br />
 address 192.168.10.2<br />
 netmask 255.255.255.0<br />
* Boot the PineCube plugged into a computer<br />
* Configure the USB Ethernet device on the computer to be in the same subnet as the PineCube<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at https://github.com/wlhe/uvc-gadget to see if it can bridge UVC gadget output with the V4L stream from the OV5640 camera sensor<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera produces only sensor noise lines over a black or white image, the camera may be in a broken state. When it was in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Focus ===<br />
<br />
The focus of the lens can be adjusted manually by rotating it. Note that the lens may initially be tight.<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions, you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which cannot by looking at their flags. The inactive flag indicates that a control is currently disabled. Some controls become inactive when others are turned on; for example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure flag is inverted, so a value of "0" means on, while "1" means off.''' Maybe the auto_exposure flag will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
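For example, to set the exposure by hand (the subdevice path and exposure value below are illustrative, and the inverted auto_exposure semantics described above apply):<br />

```shell
# Disable auto exposure first; "1" currently means off/manual on this kernel
v4l2-ctl -d /dev/v4l-subdev0 --set-ctrl auto_exposure=1
# The exposure control can now be set manually
v4l2-ctl -d /dev/v4l-subdev0 --set-ctrl exposure=1000
```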
<br />
=== Sound Controls ===<br />
<br />
Note that sound is currently only available with special patches on top of a 5.13.13 or higher kernel with Armbian or NixOS. Once you have a kernel that supports sound, you can install alsa-utils to get the alsamixer tool. The following mixer settings have been found to work with both playback and record. Note that you'll need to press F5 to see the capture controls and the space bar to toggle capture for a device. The speaker dangles on a wire from the device. The microphone is located about 1cm below the lens on the front-facing circuit board.<br />
<br />
<br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (manufacturer file, not open source)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| TBD (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release that appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328 MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The Motion daemon can be enabled (as root) with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that Motion currently takes considerable resources on the PineCube, so stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
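The config edits above can be scripted. A minimal sketch follows; the stand-in config file and its path are illustrative, so run it against a copy of /etc/motion/motion.conf before touching the real file (the v4l2_palette must also be set, per the Motion documentation):<br />

```shell
# Sketch: apply the recommended 640x480 @ 15fps settings to a motion.conf copy.
conf=/tmp/motion.conf.sketch
printf 'width 320\nheight 240\nframerate 30\n' > "$conf"   # stand-in config

# Rewrite the three geometry/rate keys in place.
sed -i -e 's/^width .*/width 640/' \
       -e 's/^height .*/height 480/' \
       -e 's/^framerate .*/framerate 15/' "$conf"
```

After copying the verified result into place, enable and start the daemon as described above.<br />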
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to connect RESET (RST) to GND, holding the microcontroller in reset.<br />
<br />
After this you can either use the Arduino IDE and its serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, wire it into the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2 Mb/s using 2.462 GHz wireless. Running sixty second iperf3 tests, the observed throughput varies between 28-50 Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
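As a sanity check, iperf3's bitrate column can be reproduced from its transfer column, keeping in mind that iperf3 reports MBytes in binary units (1 MByte = 1048576 bytes) but Mbits/sec in decimal units:<br />

```shell
# 293 MBytes transferred in 60 seconds -> Mbits/sec
awk 'BEGIN { printf "%.1f\n", 293 * 1048576 * 8 / 60 / 1e6 }'
# prints 41.0, matching the sender line above
```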
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
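For reference, the playlist.m3u8 that hlssink keeps rewriting is a small text file. An illustrative example follows; the segment names, durations and sequence numbers are made up and will differ in practice:<br />

```text
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:1.0,
segment00042.ts
#EXTINF:1.0,
segment00043.ts
```

Clients re-fetch this file and then download the newest .ts segments it lists.<br />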
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
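A minimal sketch of such a page is below. It assumes hls.js has been downloaded next to it as <code>hls.js</code>; the element id and file path (/tmp here, /dev/shm/hls/index.html in production) are arbitrary choices:<br />

```shell
# Write a bare-bones HLS player page; move it to /dev/shm/hls/index.html once reviewed.
cat > /tmp/hls-index.html <<'EOF'
<!DOCTYPE html>
<html>
<body>
  <video id="video" controls autoplay muted></video>
  <script src="hls.js"></script>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {          // most desktop browsers need hls.js
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');
      hls.attachMedia(video);
    } else {                          // Safari/iOS play HLS natively
      video.src = 'playlist.m3u8';
    }
  </script>
</body>
</html>
EOF
```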
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to /dev/shm/hls. This will serve on port 80, rather than the python3 server's default of port 8000.<br />
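A minimal nginx server block for this could look like the sketch below. The file name and heredoc path are illustrative; adjust to your distribution's layout (e.g. /etc/nginx/sites-available/) and check for conflicts with the existing default_server:<br />

```shell
# Sketch of an nginx site config serving the HLS files from tmpfs.
# Written to /tmp here; install under /etc/nginx/ once reviewed.
cat > /tmp/hls-nginx.conf <<'EOF'
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;

    location / {
        # playlist.m3u8 must not be cached; clients poll it for new segments
        add_header Cache-Control no-cache;
    }
}
EOF
```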
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around ~45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the HLS stream. Total system RAM used is roughly 64MB and the load average is ~0.4-~0.5 when idle, and ~0.51-~0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the above gstreamer, but in 1280x720 resolution. Also, you will be streaming to the desktop machine using UDP, with IP address represented by $desktop below at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera pick the "Dummy Video Device..." and it should be much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed in a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories that also works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion and works well with video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool from the v4l-utils package, before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is a small modification to /lib/systemd/system/motion.service:<br />
<br />
[Service]<br />
Type=simple<br />
User=motion<br />
# Add this line, using the mode the camera should run in (systemd does not allow trailing comments on unit file lines):<br />
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'<br />
ExecStart=/usr/bin/motion<br />
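A more upgrade-proof alternative to editing the packaged unit file is a systemd drop-in override: running <code>systemctl edit motion</code> creates /etc/systemd/system/motion.service.d/override.conf for you, and that file only needs the extra directive (this is a sketch; use whatever mode your setup requires):<br />

```
[Service]
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'
```

Run <code>systemctl daemon-reload</code> afterwards; the override survives package upgrades that replace /lib/systemd/system/motion.service.<br />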
<br />
Note that you must modify /etc/motion/motion.conf so that the v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube will have a wired ethernet connection to the main network, it is possible to use it as a WiFi access point, possibly extending your existing network further outside. Here are some steps to do this starting from an Armbian system. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to "static" and add address/netmask lines if a static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit the /etc/default/hostapd uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit the /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
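As a sketch of the /etc/hostapd.conf referenced above, something like the following bridges the AP onto br0 (the SSID, passphrase and channel are placeholders, and the driver line may differ for the PineCube's RTL8189ES module):<br />

```
interface=wlan0
bridge=br0
driver=nl80211
ssid=MyPineCubeAP
hw_mode=g
channel=6
wpa=2
wpa_passphrase=ChangeMePlease
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
```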
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male-to-male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* Additional patch to the PineCube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
* Instructions for sunxi and the ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules so they load on startup<br />
* Configure the g_ether device to start with a stable MAC address in /etc/modprobe.d/g_ether.conf:<br />
 options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (/etc/network/interfaces):<br />
 auto usb0<br />
 iface usb0 inet static<br />
 address 192.168.10.2<br />
 netmask 255.255.255.0<br />
* Boot the PineCube while plugged into a computer<br />
* Configure the USB ethernet device on the computer to be in the same subnet as the PineCube<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at https://github.com/wlhe/uvc-gadget to see if it can bridge the UVC gadget output with the v4l stream from the OV5640 camera sensor<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output shows only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Focus ===<br />
<br />
The focus of the lens can be adjusted manually by rotating it. Note that the lens may be tight at first.<br />
<br />
=== Low light mode ===<br />
<br />
To capture imagery in low-light conditions, you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed by looking at their flags. The inactive flag indicates that a control is currently disabled. Some controls become inactive when others are turned on; for example, the gain control above is inactive because gain_automatic is enabled with a value of "1." Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' This may get fixed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
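For example, manual exposure could be set like this (an illustrative command sequence; the subdev path and exposure value will vary on your system):<br />

```
# Remember: auto_exposure is currently inverted -- 1 disables auto exposure
v4l2-ctl -d /dev/v4l-subdev0 --set-ctrl auto_exposure=1
# Now the exposure control is active and can be set manually
v4l2-ctl -d /dev/v4l-subdev0 --set-ctrl exposure=1000
```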
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine as the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (Chinese source, not open source)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> when doing things like apt update and apt upgrade, and start it again afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jumper RESET (RST) to GND to isolate the SoC.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires into the GPIO header, connect it in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
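When logging results over time, it can be handy to pull the bitrate figures out of iperf3's output. A small sketch (the function name is ours; it assumes iperf3's default human-readable summary format, as shown in the results below):<br />

```shell
# Sketch: extract the summary bitrates from a saved iperf3 text report.
# Prints lines like "sender 41.0 Mbits/sec" / "receiver 40.7 Mbits/sec".
parse_iperf3_summary() {
    awk '/sender|receiver/ {
        # Find the "Mbits/sec" token; the bitrate is the field before it,
        # and the role (sender/receiver) is the last field on the line.
        for (i = 2; i <= NF; i++)
            if ($i == "Mbits/sec")
                printf "%s %s Mbits/sec\n", $NF, $(i-1)
    }' "$1"
}

# Typical use:
#   iperf3 -c pinecube -t 60 > /tmp/run.txt && parse_iperf3_summary /tmp/run.txt
```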
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28 and 50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
Note that the Ethernet network does not work in the current Ubuntu Focal or Ubuntu Groovy Armbian images; the results below were gathered with a working Ethernet setup.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
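For illustration, the playlist.m3u8 that hlssink (re)writes looks roughly like this (segment names, durations and sequence numbers will differ with your settings):<br />

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:1.0,
segment00042.ts
#EXTINF:1.0,
segment00043.ts
```

Clients poll this file and fetch the listed .ts segments as they appear.<br />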
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake autoconf automake libtool gstreamer1.0-tools \<br />
gstreamer1.0-alsa gstreamer1.0-x gstreamer1.0-plugins-base \<br />
gstreamer1.0-plugins-{good,bad,ugly} v4l-utils v4l2loopback-dkms alsa-utils \<br />
libpango1.0-0 libpango1.0-dev x264 liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libvpx-dev libx264-dev libx265-dev \<br />
libjpeg-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compressing the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the HLS stream. Total system RAM used is roughly 64MB, and the load average is around 0.4-0.5 when idle and around 0.51-0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
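To start this stream automatically at boot, one option is to put the four commands above into a script such as <code>/usr/local/bin/pinecube-rtsp.sh</code> and wrap it in a systemd unit. This is only a sketch; the unit and script names are made up here, not part of v4l2rtspserver:<br />

```ini
# /etc/systemd/system/pinecube-rtsp.service (hypothetical unit name)
[Unit]
Description=PineCube v4l2rtspserver stream
After=network-online.target

[Service]
Type=simple
# pinecube-rtsp.sh should contain the media-ctl, modprobe, v4l2compress and
# v4l2rtspserver commands from this section, with v4l2compress backgrounded
ExecStart=/usr/local/bin/pinecube-rtsp.sh
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with <code>systemctl enable --now pinecube-rtsp.service</code>.<br />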
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we used this very small third-party tool based on it instead: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
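As a sketch of that multicast variant (untested here; 239.255.42.42 is an arbitrary address from the administratively scoped 239.0.0.0/8 range), the sender and receiver pipelines might look like this. The script just prints both commands for review:<br />

```shell
#!/bin/sh
# Sketch: multicast JPEG RTP. Prints the sender and receiver commands.
MCAST="${MCAST:-239.255.42.42}"   # example multicast group, adjust as needed
PORT="${PORT:-8000}"
SEND="gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 \
 ! rtpjpegpay ! udpsink host=$MCAST auto-multicast=true port=$PORT"
RECV="gst-launch-1.0 udpsrc address=$MCAST auto-multicast=true port=$PORT \
 ! application/x-rtp,encoding-name=JPEG,payload=26 \
 ! rtpjpegdepay ! jpegdec ! autovideosink"
echo "sender:   $SEND"
echo "receiver: $RECV"
```

Every machine that runs the receiver pipeline joins the same group and gets the same stream.<br />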
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response that browsers other than Firefox require). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
# fdsink writes the stream to fd 2, which the redirections below swap with<br />
# stdout, so gstreamer's own log output goes to /dev/null instead of into<br />
# the HTTP response<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the examples above, but at 1280x720 resolution. You will stream to the desktop machine over UDP on port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
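When the pipeline gets stuck in this not-negotiated state, reloading the v4l2loopback module on the desktop usually gives you a clean /dev/video10 again. A sketch (the function name is made up, and MODPROBE is overridable only so the commands can be previewed with <code>MODPROBE=echo</code>; the real thing needs root):<br />

```shell
#!/bin/sh
# Reload v4l2loopback to recover /dev/video10 after a stuck pipeline.
# Make sure no v4l client (vlc, chromium, gstreamer) still holds the device.
reload_v4l2loopback() {
  : "${MODPROBE:=modprobe}"
  "$MODPROBE" -r v4l2loopback &&
  "$MODPROBE" v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1
}
```

Run it as root with <code>reload_v4l2loopback</code>, then restart the receiving gstreamer pipeline.<br />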
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment; the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed on a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. It is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion and works well with video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package. That will need to be set before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to map each mode to its v4l2_palette value. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
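Since the media-ctl mode and Motion's v4l2_palette must be kept in sync by hand, a tiny helper along these lines can help avoid mismatches. This is a sketch: only the UYVY8 mapping used in this section is filled in, and other palette numbers must be looked up in the Motion documentation:<br />

```shell
#!/bin/sh
# Map a media-ctl pixel format to Motion's v4l2_palette number.
palette_for_mode() {
  case "$1" in
    UYVY8_2X8/*) echo 14 ;;  # UYVY, matching the motion.conf example above
    *) echo "unknown mode: $1 -- check the Motion docs" >&2; return 1 ;;
  esac
}

# Example: prints 14
palette_for_mode 'UYVY8_2X8/1280x720@1/15'
```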
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube will have a wired ethernet connection to the main network, it is possible to use it as a WiFi access point, possibly extending your existing network further outside. Here are some steps to do this starting from an Armbian system. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to manual if static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit the /etc/default/hostapd uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit the /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
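For reference, a minimal /etc/hostapd.conf might look like the following. This is a sketch: the SSID and passphrase are placeholders, and depending on your kernel the RTL8189ES WiFi module may need driver settings other than nl80211:<br />

```ini
# /etc/hostapd.conf -- minimal sketch, values are placeholders
interface=wlan0
bridge=br0
driver=nl80211
ssid=PineCubeAP
hw_mode=g
channel=6
wpa=2
wpa_passphrase=ChangeMe12345
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
```

The bridge= line attaches WiFi clients to the br0 bridge configured above, so they get addresses from the main network's DHCP server.<br />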
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male to male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* Additional patch to the PineCube device tree to disable ehci0 and ohci0, enable the usb_otg device instead, and set dr_mode to otg<br />
* Instructions for sunxi and the ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
* Configure the g_ether device to start with a stable MAC address in /etc/modprobe.d/g_ether.conf:<br />
 options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (via /etc/network/interfaces):<br />
 auto usb0<br />
 iface usb0 inet static<br />
 address 192.168.10.2<br />
 netmask 255.255.255.0<br />
* Boot the PineCube while plugging it into a computer<br />
* Configure the USB ethernet device on the computer to be in the same subnet as the PineCube<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at https://github.com/wlhe/uvc-gadget to see if it can bridge the UVC gadget output with the v4l stream from the OV5640 camera sensor<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera produces only sensor-noise lines over a black or white image, the camera may be in a broken state. When it was in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Focus ===<br />
<br />
The focus of the lens can be manually adjusted by rotating it. Note that the lens may initially be tight.<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area, and also enable the IR cut filter, using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
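The echo commands above can be wrapped into a small helper so night mode can be toggled in one step. This is a sketch: the function name is made up, SYSFS is overridable only for testing, and as noted above the LED polarity may be inverted depending on your kernel version:<br />

```shell
#!/bin/sh
# Toggle night mode (IR LEDs plus IR cut filter) via the sysfs paths above.
# Assumes gpio45 has already been exported and set to "out"; run as root.
SYSFS="${SYSFS:-/sys}"

night_mode() {  # usage: night_mode on|off
  case "$1" in
    on)  leds=0; ircut=1 ;;  # echo 0 turns the LEDs on (may be inverted)
    off) leds=1; ircut=0 ;;
    *)   echo "usage: night_mode on|off" >&2; return 1 ;;
  esac
  echo "$leds"  > "$SYSFS/class/leds/pine64:ir:led1/brightness"
  echo "$leds"  > "$SYSFS/class/leds/pine64:ir:led2/brightness"
  echo "$ircut" > "$SYSFS/class/gpio/gpio45/value"
}
```

For example, run <code>night_mode on</code> at dusk and <code>night_mode off</code> in the morning, e.g. from cron.<br />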
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can currently be changed by looking at their flags. The inactive flag indicates that a control is currently disabled. Some controls become inactive when others are turned on; for example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' Maybe the auto_exposure control will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
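As a worked example of that note, the following sketch disables automatic exposure and sets a manual value. The function name and the value 600 are arbitrary; V4L2CTL and DEV are overridable only so the commands can be previewed with <code>V4L2CTL=echo</code>, and DEV defaults to /dev/v4l-subdev0, which you should adjust to match /dev/v4l-subdev* on your system:<br />

```shell
#!/bin/sh
# Switch the sensor to manual exposure. Remember the inverted control:
# on this kernel, auto_exposure=1 means automatic exposure OFF.
set_manual_exposure() {  # usage: set_manual_exposure <value 0-65535>
  : "${V4L2CTL:=v4l2-ctl}" "${DEV:=/dev/v4l-subdev0}"
  "$V4L2CTL" -d "$DEV" --set-ctrl auto_exposure=1 &&
  "$V4L2CTL" -d "$DEV" --set-ctrl exposure="$1"
}
```

For example <code>set_manual_exposure 600</code>; re-enable automatic exposure afterwards with <code>v4l2-ctl -d /dev/v4l-subdev0 --set-ctrl auto_exposure=0</code>.<br />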
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=11203PineCube2021-08-31T02:29:11Z<p>Newton688: </p>
<hr />
<div>{{under construction}}<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26-pin GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the Chinese manufacturer; not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| TBD (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> when doing things like apt update and apt upgrade, and start it again afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND to isolate the microcontroller.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or use screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wire, connect to the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s on a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
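<br />
As a starting point for experimenting with the hardware encoder, an untested sketch using gst-plugin-cedar might look like this (the element name cedar_h264enc is an assumption based on that plugin's README, and the plugin must be built and installed first):<br />

```shell
# Untested sketch: hardware h264 encoding via gst-plugin-cedar
# (assumes the plugin builds and registers an element named cedar_h264enc)
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 \
  ! videoconvert ! video/x-raw,format=I420 \
  ! cedar_h264enc ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3
```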
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to /dev/shm/hls. This will serve on port 80, rather than on port 8000 as the python3 server does by default.<br />
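<br />
A minimal nginx site configuration for this could look like the following sketch (the default-site path and layout are assumptions; adapt to your distribution):<br />

```
# /etc/nginx/sites-available/default (sketch)
server {
    listen 80 default_server;
    root /dev/shm/hls;   # serve the HLS playlist, segments and index.html from tmpfs
    index index.html;
}
```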
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
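<br />
For reference, pushing the same encoded stream to an nginx-rtmp ingest point instead of writing HLS segments locally might look like this sketch (rtmp://nginx-host/live/stream is a placeholder for your own ingest URL):<br />

```shell
# Sketch: send h264 to an nginx-rtmp server with rtmpsink
# (rtmp://nginx-host/live/stream is a placeholder URL)
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 \
  ! videoconvert ! video/x-raw,format=I420 \
  ! x264enc speed-preset=ultrafast tune=zerolatency ! h264parse \
  ! flvmux streamable=true ! rtmpsink location='rtmp://nginx-host/live/stream'
```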
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the stream. Total system RAM used is roughly 64MB, and the load average is ~0.4-0.5 when idle and ~0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
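<br />
One untested sketch, combining the camera setup from the HLS example with gst-rtsp-launch from the JPEG RTSP example:<br />

```shell
# Untested sketch: software-encoded h264 over RTSP using gst-rtsp-launch
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]' &&
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
```

If it works, the stream should be playable with <code>vlc rtsp://pinecube.local:8554/video</code> as in the JPEG example.<br />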
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
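<br />
A multicast variant might look like the following sketch (239.255.12.42 is an arbitrary example address from the administratively scoped multicast range):<br />

```shell
# Sketch: multicast JPEG RTP so that many receivers can join the same stream
# Sender (on the PineCube); 239.255.12.42 is an example group address:
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=239.255.12.42 port=8000 auto-multicast=true
# Receiver (on each viewing machine):
gst-launch-1.0 udpsrc address=239.255.12.42 port=8000 auto-multicast=true ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
```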
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera pick the "Dummy Video Device..." and it should be much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, ie. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the PineCube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect it from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed on a variety of Linux flavours. A package is available in the standard Ubuntu and Debian repositories and works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion and works well with video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package. That will need to be set before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
<br />
Note that you must modify /etc/motion/motion.conf, setting the v4l2_palette, width, height and framerate to match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
<br />
== PineCube as a WiFi AP ==<br />
<br />
If the PineCube has a wired ethernet connection to the main network, it is possible to use it as a WiFi access point, extending your existing wireless coverage. Here are the steps to do this, starting from an Armbian system. Note that you may need to upgrade your kernel to 5.13.x for this to work well.<br />
<br />
* Install bridge-utils package using apt-get<br />
* Add the following to your /etc/network/interfaces to set up both the eth0 ethernet interface and the br0 bridge interface (change br0 to manual if static IP is preferred)<br />
<br />
/etc/network/interfaces:<br />
auto eth0<br />
iface eth0 inet manual<br />
pre-up /sbin/ifconfig $IFACE up<br />
pre-down /sbin/ifconfig $IFACE down<br />
auto br0<br />
iface br0 inet dhcp<br />
bridge_ports eth0<br />
bridge_stp on<br />
<br />
* Edit /etc/default/hostapd, uncommenting the line with 'DAEMON_CONF="/etc/hostapd.conf"'<br />
* Edit the /etc/hostapd.conf to set the SSID, password and channel for your AP.<br />
* Run <code>sudo systemctl enable hostapd.service</code> to enable the hostapd service on startup<br />
* Reboot your cube with the ethernet cable connected<br />
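<br />
A minimal /etc/hostapd.conf for this setup could look like the following sketch (the interface name, SSID, channel and passphrase are placeholders to adjust):<br />

```
# /etc/hostapd.conf (sketch; adjust interface, ssid, channel and wpa_passphrase)
interface=wlan0
bridge=br0
ssid=ExampleSSID
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=ChangeThisPassphrase
rsn_pairwise=CCMP
```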
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* Additional patch to the PineCube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
* Instructions for sunxi and the ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
* Configure the g_ether device to start with a stable MAC address in /etc/modprobe.d/g_ether.conf:<br />
 options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (/etc/network/interfaces):<br />
 auto usb0<br />
 iface usb0 inet static<br />
 address 192.168.10.2<br />
 netmask 255.255.255.0<br />
* Boot the PineCube while plugged into the computer<br />
* Configure the USB ethernet device on the computer to be in the same subnet as the PineCube<br />
<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at https://github.com/wlhe/uvc-gadget to see if it can bridge UVC gadget output with the v4l stream from the OV5640 camera sensor<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output shows only sensor-noise lines over a black or white image, the camera may be in a broken state. When it was in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To capture imagery in low-light conditions you can turn on the infrared LEDs to illuminate the dark area, and enable the IR cut filter, using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can tell which controls can currently be changed by looking at their flags. The inactive flag indicates that a control is currently disabled; some controls become inactive when others are enabled. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' Maybe the auto_exposure control will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
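<br />
For example, to set the exposure manually (remembering the inverted auto_exposure values):<br />

```shell
# Disable auto exposure (inverted: value 1 means auto exposure off)
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl auto_exposure=1
# The exposure control should now be active and settable (range 0-65535)
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl exposure=500
```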
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
Newton688 https://wiki.pine64.org/index.php?title=PineCube&diff=11202 PineCube 2021-08-31T02:17:02Z<p>Newton688: /* Using pinecube as a security camera with Motion */</p>
<hr />
<div>{{under construction}}<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26-pin GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they cannot all be used at the same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the manufacturer; not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| TBD (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to log in to the serial console, then check for the assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (xz file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480, 15fps, YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
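As a sketch, the resolution and frame rate can be applied to motion's configuration non-interactively with sed. This operates on a stand-in copy rather than /etc/motion/motion.conf so it is safe to try anywhere; width/height/framerate are standard motion configuration keys, and the YU12 palette is selected separately via motion's v4l2_palette option (consult your motion version's documentation for the matching value).<br />

```shell
#!/bin/sh
# Stand-in for /etc/motion/motion.conf with some typical defaults:
CONF=motion.conf
printf 'width 1280\nheight 720\nframerate 30\n' > "$CONF"
# Rewrite the capture geometry to the recommended 640x480 @ 15fps:
sed -i -e 's/^width .*/width 640/' \
       -e 's/^height .*/height 480/' \
       -e 's/^framerate .*/framerate 15/' "$CONF"
# Show the resulting settings:
grep -E '^(width|height|framerate)' "$CONF"
```

Point CONF at /etc/motion/motion.conf (as root) to apply this on the device itself.<br />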
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND, which holds the microcontroller in reset and frees the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Attach a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] to the cable and run breadboard wires into the GPIO header, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
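Whichever adapter you use, you can check which device node it was assigned before launching screen (USB-serial adapters usually appear as /dev/ttyUSBx, Arduinos as /dev/ttyACMx):<br />

```shell
# List candidate USB serial device nodes, if any are attached:
ls /dev/ttyUSB* /dev/ttyACM* 2>/dev/null || echo "no USB serial devices found"
```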
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
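As a sanity check on iperf3's numbers: it reports Transfer in MBytes (binary mebibytes) but Bitrate in decimal Mbits/sec, so a transfer of 293 MBytes in 60 seconds (as seen in the wireless results below) corresponds to the reported 41.0 Mbits/sec:<br />

```shell
# Convert iperf3's Transfer (MiB) over a duration to a decimal Mbits/sec bitrate.
# 293 MBytes over 60 seconds:
awk 'BEGIN { mib = 293; secs = 60; printf "%.1f Mbits/sec\n", mib * 1024 * 1024 * 8 / secs / 1e6 }'
# prints "41.0 Mbits/sec"
```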
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462 GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28 and 50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92 and 102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
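Comparing the plaintext and WireGuard figures above gives a rough sense of the tunnel's cost on this hardware; in the client direction on the wired link it works out to about a quarter of the throughput:<br />

```shell
# Rough WireGuard overhead on the wired link, client direction:
# plaintext 94.4 Mbits/sec vs. 71.2 Mbits/sec through the tunnel
awk 'BEGIN { printf "%.0f%%\n", (94.4 - 71.2) / 94.4 * 100 }'
# prints "25%"
```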
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
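For reference, a minimal index.html along the lines of the hls.js "Getting Started" example can be dropped into place with a heredoc. This is a sketch: the element id and the local hls.js filename are assumptions, and you should download hls.js into the same directory first.<br />

```shell
#!/bin/sh
# Write a minimal HLS player page into the tmpfs directory:
HLSDIR="${HLSDIR:-/dev/shm/hls}"
mkdir -p "$HLSDIR"
cat > "$HLSDIR/index.html" <<'EOF'
<!doctype html>
<video id="video" controls autoplay muted></video>
<script src="hls.js"></script>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = 'playlist.m3u8';   // native HLS (Safari/iOS)
  }
</script>
EOF
# Confirm the page references the playlist written by hlssink:
grep -c playlist.m3u8 "$HLSDIR/index.html"
```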
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-tools gstreamer1.0-plugins-base \<br />
gstreamer1.0-plugins-{good,bad,ugly} gstreamer1.0-x gstreamer1.0-alsa \<br />
v4l-utils alsa-utils libpango1.0-0 libpango1.0-dev x264 liblivemedia-dev \<br />
liblog4cpp5-dev libasound2-dev vlc libssl-dev iotop autoconf automake \<br />
libtool v4l2loopback-dkms libvpx-dev libx264-dev libjpeg-dev libx265-dev \<br />
linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
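A quick way to check for a competing motion process before reconfiguring the camera (on systemd images you can then stop it with <code>systemctl stop motion</code>):<br />

```shell
# Show any running motion processes that may be holding the camera,
# or report that none are running:
pgrep -a motion || echo "no motion process running"
```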
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around ~45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the HLS stream. Total system RAM used is roughly 64MB and the load average is ~0.4-~0.5 when idle, and ~0.51-~0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
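A plausible but untested starting point, combining the camera setup and x264enc settings from the HLS example with the gst-rtsp-launch tool from the JPEG RTSP example (the caps and encoder settings are assumptions; please verify on hardware and update the wiki):<br />

```shell
#!/bin/sh
# Untested sketch: h264 RTSP via gst-rtsp-launch with a software encoder.
PIPELINE='v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
# Only attempt to run on a device that actually has the tools installed:
if command -v media-ctl >/dev/null 2>&1 && command -v gst-rtsp-launch >/dev/null 2>&1; then
  media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
  gst-rtsp-launch "$PIPELINE"
else
  echo "media-ctl/gst-rtsp-launch not found; pipeline would be: $PIPELINE"
fi
```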
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the above gstreamer, but in 1280x720 resolution. Also, you will be streaming to the desktop machine using UDP, with IP address represented by $desktop below at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error when launching the gstreamer pipeline above is the following message, which seems to occur when [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers isn't set] on the v4l2loopback module (see above), or when a v4l client (vlc, chromium) is already connected to /dev/video10 when the pipeline starts. There is a small amount of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that /dev/video10 is hooked into the gstreamer pipeline, you can connect to it using VLC, which is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are currently [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems].<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect VLC before trying to use the virtual webcam with Chromium. Launch Chromium and go to a web conference such as [https://meet.jit.si Jitsi]. When prompted for a camera, pick the "Dummy Video Device..." and the feed should look much like it does in VLC. Note that Firefox isn't working at the moment; the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. it shows only the first frame after connecting to the camera and then drops. It's unclear whether the bug is in gstreamer, v4l, ffmpeg, or somewhere in these instructions.<br />
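For convenience, the desktop-side steps above (loading v4l2loopback and running the receiving pipeline) can be wrapped in a small launcher script. This is a sketch: the script name and the RUN=1 convention are inventions of this example, while the module options and pipeline mirror the commands above.<br />

```shell
#!/bin/sh
# virtualcam.sh (hypothetical name) -- the desktop-side steps above in
# one script. By default it only prints the pipeline so you can inspect
# it; RUN=1 actually loads the module and starts the pipeline.
VIDEO_NR="${VIDEO_NR:-10}"   # becomes /dev/video10
PORT="${PORT:-8000}"         # UDP port the PineCube streams to

PIPELINE="udpsrc port=$PORT ! application/x-rtp,encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw,format=I420,width=1280,height=720 ! autovideoconvert ! v4l2sink device=/dev/video$VIDEO_NR"

echo "gst-launch-1.0 $PIPELINE"
if [ "$RUN" = "1" ]; then
    # max_buffers and exclusive_caps are needed for chromium
    # (see the troubleshooting note above)
    sudo modprobe v4l2loopback video_nr="$VIDEO_NR" max_buffers=32 exclusive_caps=1
    gst-launch-1.0 $PIPELINE
fi
```

Run it as <code>RUN=1 sh virtualcam.sh</code>; without RUN=1 it only prints the pipeline.<br />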
<br />
== Pinecube as a security camera with Motion ==<br />
<br />
It's possible to use the PineCube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect the camera from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflections.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed on a variety of Linux flavours; there's a package in the standard Ubuntu and Debian repositories that works with Armbian. It provides a simple web interface for live viewing of the camera feed, and its motion-trigger capabilities can store still pictures or, in later versions, videos. It is also possible to build hooks that automatically process or upload those recordings.<br />
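As a sketch of such a hook: Motion can run an external command when it finishes writing a recording, via the on_picture_save / on_movie_end options in motion.conf, passing the saved filename as %f. The script name and archive path here are hypothetical.<br />

```shell
#!/bin/sh
# /usr/local/bin/motion-archive.sh (hypothetical name/path)
# Called by Motion with a recording's filename; copies it to an archive
# directory. Hook it up in /etc/motion/motion.conf with e.g.:
#   on_movie_end /usr/local/bin/motion-archive.sh %f
archive_recording() {
    FILE="$1"
    DEST="${DEST:-/var/spool/motion-archive}"   # hypothetical destination
    [ -f "$FILE" ] || return 1                  # nothing to do if it vanished
    mkdir -p "$DEST"
    cp "$FILE" "$DEST/"       # replace cp with scp/rclone to upload off-device
    echo "archived $FILE to $DEST"
}

# When invoked by Motion, the filename arrives as $1:
if [ -n "$1" ]; then archive_recording "$1"; fi
```
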
<br />
To get things working quickly with Motion, set the following in /etc/motion/motion.conf and start the service with "sudo /etc/init.d/motion start":<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution work fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions, you'll need to set the camera mode with the media-ctl tool that comes with the v4l-utils package, and it must be set before the Motion service starts. A simple way to ensure that it gets set before Motion starts every time, even across reboots, is a small modification to the /etc/init.d/motion script:<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
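On systemd-based images where Motion ships a service unit rather than an init script, a drop-in override achieves the same thing. This is a sketch; create it with <code>systemctl edit motion</code> (the exact unit name and paths may differ on your system):<br />

```ini
# /etc/systemd/system/motion.service.d/override.conf
# (created by "systemctl edit motion")
[Service]
# Set the desired sensor mode before the Motion daemon starts
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'
```
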
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette value to the mode. Here is a list of modes that have been tried so far:<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a USB-A male-to-male cable to plug it into your computer. Note that the Micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
-Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
-Additional patch to the pinecube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
-Instructions for sunxi and ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
-Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
-Configure the g_ether device to start with a stable MAC address<br />
/etc/modprobe.d/g_ether.conf:<br />
options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
-Set a static IP address for usb0 on startup via ifupdown (/etc/network/interfaces)<br />
auto usb0<br />
iface usb0 inet static<br />
address 192.168.10.2<br />
netmask 255.255.255.0<br />
-Boot the pinecube while it is plugged into a computer<br />
-Configure the USB ethernet device on the computer to be in the same subnet as the pinecube<br />
<br />
-Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam<br />
-Look at this project to see if it can bridge UVC gadget output with the v4l stream from the OV5640 camera sensor<br />
https://github.com/wlhe/uvc-gadget<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
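As a concrete sketch based on the virtual-camera pipeline above: videotestsrc emits raw video rather than MJPEG, so jpegenc is added before the RTP payloader. The DESKTOP address is a placeholder, and the RUN=1 convention is an invention of this example.<br />

```shell
# Synthetic test: v4l2src is replaced with videotestsrc (plus jpegenc,
# since videotestsrc produces raw frames rather than MJPEG). This
# exercises the whole network path without touching the camera hardware.
DESKTOP="${DESKTOP:-192.168.1.100}"   # placeholder for your desktop's IP
PIPELINE="videotestsrc ! video/x-raw,width=1280,height=720,framerate=30/1 ! jpegenc ! rtpjpegpay name=pay0 ! udpsink host=$DESKTOP port=8000"
echo "gst-launch-1.0 $PIPELINE"
if [ "$RUN" = "1" ]; then gst-launch-1.0 $PIPELINE; fi
```
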
<br />
If the camera output shows only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to illuminate the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
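The commands above can be wrapped in a small toggle script. This is a sketch: the script name is hypothetical, the LED polarity may be inverted depending on your kernel (as noted above), and the SYSFS variable exists only so the logic can be exercised without real hardware.<br />

```shell
#!/bin/sh
# night-mode.sh on|off (hypothetical name) -- wraps the sysfs writes
# above. Run as root. LED values may be inverted on some kernels.
SYSFS="${SYSFS:-/sys}"   # overridable so the logic can be tested
GPIO=45                  # IR-cut filter control line

night_mode() {
    case "$1" in
        on)  led=0; cut=1 ;;   # LEDs on, IR-cut filter engaged
        off) led=1; cut=0 ;;   # LEDs off, IR-cut filter disengaged
        *)   echo "usage: night-mode.sh on|off" >&2; return 1 ;;
    esac
    echo "$led" > "$SYSFS/class/leds/pine64:ir:led1/brightness"
    echo "$led" > "$SYSFS/class/leds/pine64:ir:led2/brightness"
    # Export the IR-cut GPIO if it isn't exported yet
    [ -d "$SYSFS/class/gpio/gpio$GPIO" ] || echo "$GPIO" > "$SYSFS/class/gpio/export"
    echo out > "$SYSFS/class/gpio/gpio$GPIO/direction"
    echo "$cut" > "$SYSFS/class/gpio/gpio$GPIO/value"
}

if [ -n "$1" ]; then night_mode "$1"; fi
```
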
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which cannot by looking at their flags. The inactive flag indicates that a control is currently disabled; some controls become inactive when others are turned on. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' Maybe this will be changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
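For example, to switch to manual exposure (remembering the inverted auto_exposure value), something like the following should work. This is a sketch; the V4L2CTL and DEV variables are inventions of this example so the commands can be stubbed out for testing.<br />

```shell
#!/bin/sh
# Sketch: switch the sensor to manual exposure with v4l2-ctl.
# Remember that auto_exposure=1 currently means *off* for this driver.
V4L2CTL="${V4L2CTL:-v4l2-ctl}"       # overridable for testing
DEV="${DEV:-/dev/v4l-subdev0}"       # adjust to your subdevice node

set_manual_exposure() {
    "$V4L2CTL" -d "$DEV" --set-ctrl auto_exposure=1   # disable auto exposure
    "$V4L2CTL" -d "$DEV" --set-ctrl exposure="$1"     # exposure is now writable
}

# On the PineCube, run e.g.: set_manual_exposure 1000
```
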
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine as the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires a free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{under construction}}<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here; this only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| TBD (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (xz file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The Motion daemon can be enabled (as root) with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that Motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> when doing things like apt update and apt upgrade, and start it again afterwards with <code>systemctl start motion</code>.<br />
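The width, height and framerate settings mentioned above can also be applied to motion.conf non-interactively, which is handy when provisioning several cameras. This is a sketch: the function name and the CONF/RUN variables are inventions of this example, and the v4l2_palette value is deliberately left to the Motion documentation since it differs between Motion versions.<br />

```shell
#!/bin/sh
# Sketch: apply the 640x480 15fps settings to motion.conf with sed.
# CONF is overridable so the edit can be tried on a copy of the file.
CONF="${CONF:-/etc/motion/motion.conf}"

set_motion_video() {
    sed -i \
        -e 's/^width .*/width 640/' \
        -e 's/^height .*/height 480/' \
        -e 's/^framerate .*/framerate 15/' \
        "$CONF"
    # Also set v4l2_palette to the YU12 value for your Motion version
    # (see the palette table in the Motion documentation).
}

if [ "$RUN" = "1" ]; then set_motion_video; fi
```
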
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND, which holds the microcontroller in reset and isolates it from the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or use screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires into the GPIO block, connect it at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results below reflect the wireless network. The link speed was 72.2Mb/s on 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varied between 28 and 50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge; the wireless network uses WPA2 and the PineCube is connected to this wireless bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
Note that the Ethernet network does not work in the current Ubuntu Focal or Ubuntu Groovy Armbian images.<br />
<br />
The performance results below reflect the Ethernet network. The link speed was 100Mb/s on a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varied between 92 and 102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root of the default configuration to /dev/shm/hls. This serves on port 80, rather than port 8000 as the python3 server does by default.<br />
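A minimal nginx server block for this might look like the following sketch (paths and cache times are assumptions; the key point is that playlist.m3u8 must not be cached for long, because it is rewritten constantly):<br />

```nginx
# Sketch of /etc/nginx/sites-available/default serving HLS from tmpfs
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;

    # The playlist is rewritten every few seconds; keep caching minimal
    location ~ \.m3u8$ {
        add_header Cache-Control "max-age=1";
    }
    # Segments are immutable once written; they may be cached briefly
    location ~ \.ts$ {
        add_header Cache-Control "max-age=60";
    }
}
```
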
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
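For example (a sketch; on the stock image motion may also be running as a service, so stop it that way first):<br />

```shell
# Free the camera before reconfiguring it: list any motion processes, then stop them.
pgrep -a motion || echo "no motion processes running"
# Stop the service if present; otherwise kill the process directly.
systemctl stop motion 2>/dev/null || pkill -f /usr/bin/motion || true
```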
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB and the load average is ~0.4-0.5 when idle, and ~0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
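An untested sketch of what that might look like, combining the UYVY capture and x264enc settings from the HLS example with gst-rtsp-launch from the JPEG RTSP example (printed here rather than executed; run the two printed commands on the PineCube, verify, and please update the wiki):<br />

```shell
# Untested sketch: capture UYVY, encode with x264enc, serve via rtph264pay.
SETUP="media-ctl --set-v4l2 '\"ov5640 1-003c\":0[fmt:UYVY8_2X8/640x480@1/15]'"
SERVE="gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'"
printf '%s\n%s\n' "$SETUP" "$SERVE"
```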
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server that serves your video as an MJPEG stream playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
GStreamer can almost do this by itself, as it has a multipartmux element which produces the headers that precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into GStreamer (which is necessary to produce the HTTP response that browsers other than Firefox require). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
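For reference, the response produced by this script looks roughly like the following (an illustration, not captured output; multipart boundaries get an extra "--" prefix on the wire, and multipartmux adds per-frame Content-Type and Content-Length headers):<br />

```
HTTP/1.1 200 OK
Content-type: multipart/x-mixed-replace;boundary=--duct_tape_boundary

----duct_tape_boundary
Content-Type: image/jpeg
Content-Length: 51234

<JPEG frame bytes>
----duct_tape_boundary
Content-Type: image/jpeg
Content-Length: 51302

<JPEG frame bytes>
...
```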
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software such as Jitsi Meet. This has fairly minimal (&lt;1s) lag when tested over a wired 1Gb ethernet connection, and the frame rate is passable. Keep in mind that MJPEG is very wasteful in terms of network bandwidth. The following instructions assume Debian Linux (Bullseye) on your desktop machine, but could work with other Linux OSes too. A similar system might someday work on Mac OS X, provided someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, as was done [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using the PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the PineCube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect it from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflections.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed on a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion, you can set the following in /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start":<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution work fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package, before the motion service starts. A simple way to ensure it gets set before motion starts every time, even across reboots, is a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a USB-A male-to-male cable to plug it into your computer. Note that the Micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* Additional patch to the PineCube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
* Instructions for sunxi and the ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
* Configure the g_ether device to start with a stable MAC address in /etc/modprobe.d/g_ether.conf:<br />
  options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (/etc/network/interfaces):<br />
  auto usb0<br />
  iface usb0 inet static<br />
  address 192.168.10.2<br />
  netmask 255.255.255.0<br />
* Boot the PineCube while plugged into the computer<br />
* Configure the USB ethernet device on the computer to be in the same subnet as the PineCube<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at https://github.com/wlhe/uvc-gadget to see if it can bridge UVC gadget output with the v4l2 feed from the OV5640 camera sensor<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
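For example, the following pipeline exercises everything except the sensor (videotestsrc and num-buffers are the only changes; fakesink stands in for whichever sink you are debugging):<br />

```shell
# Swap v4l2src for videotestsrc to test the pipeline without camera hardware.
# num-buffers=30 makes it exit after ~2 seconds instead of running forever.
if command -v gst-launch-1.0 >/dev/null; then
  gst-launch-1.0 videotestsrc num-buffers=30 ! video/x-raw,width=320,height=240,framerate=15/1 ! videoconvert ! fakesink && RESULT=ok || RESULT=failed
else
  RESULT="gst-launch-1.0 not installed"
fi
echo "$RESULT"
```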
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian, following [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter these instructions].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which cannot by looking at the flags. The inactive flag indicates that a control is currently disabled. Some controls are disabled when others are turned on. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' Maybe this will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
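A sketch of switching to manual exposure, following the inverted convention described above. The subdevice path and the exposure value 1000 are assumptions; adjust for your system.<br />

```shell
# Pick the first v4l subdevice, if any (on the PineCube this is the ov5640 sensor).
DEV=$(ls /dev/v4l-subdev* 2>/dev/null | head -n1)
if [ -n "$DEV" ]; then
  v4l2-ctl -d "$DEV" --set-ctrl auto_exposure=1   # 1 = auto exposure OFF (inverted!)
  v4l2-ctl -d "$DEV" --set-ctrl exposure=1000     # manual value now takes effect
  RESULT=configured
else
  RESULT="no v4l subdevice found (not running on the PineCube?)"
fi
echo "$RESULT"
```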
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=11164PineCube2021-08-23T17:08:11Z<p>Newton688: </p>
<hr />
<div>{{under construction}}<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (Chinese source, not an open-source format)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| TBD (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328 MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although it is [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], it enables the usage of Debian and Ubuntu on the PineCube.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled as root with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480, 15 fps, YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
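The motion.conf changes described above can be sketched as a small script. This is an illustrative example only: it edits a scratch copy of the config so it can run anywhere, whereas on the PineCube you would target /etc/motion/motion.conf as root. The YU12 palette line is deliberately omitted because the numeric v4l2_palette index depends on your Motion version.<br />

```shell
# Edit a scratch copy of motion.conf; on the device, target
# /etc/motion/motion.conf instead (as root).
conf=$(mktemp)
cat > "$conf" <<'EOF'
width 320
height 240
framerate 30
EOF

# Switch Motion to 640x480 at 15 fps, as recommended above.
sed -i -e 's/^width .*/width 640/' \
       -e 's/^height .*/height 480/' \
       -e 's/^framerate .*/framerate 15/' "$conf"
cat "$conf"
```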
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND, which holds the microcontroller in reset and keeps it off the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, connect it to the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462 GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
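iperf3 reports the Transfer column in binary MBytes but the Bitrate column in decimal Mbits/sec. As a sanity check on results like the ones in this section, the conversion can be done with a small awk helper (illustrative, runs anywhere):<br />

```shell
# Convert an iperf3 transfer figure (binary MBytes over a duration in
# seconds) into the decimal Mbits/sec bitrate iperf3 prints next to it.
mbytes_to_mbits() {
  awk -v mb="$1" -v secs="$2" \
    'BEGIN { printf "%.1f\n", mb * 1024 * 1024 * 8 / secs / 1000000 }'
}

mbytes_to_mbits 293 60   # 293 MBytes in 60 s -> prints 41.0
```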
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
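To make the segment rotation concrete, here is a hardware-free shell sketch of the bookkeeping that a segmenter like hlssink with max-files=3 performs: write a new .ts chunk, rewrite the playlist to list only the newest segments, and delete chunks that fell out of the window. The filenames and the stripped-down playlist are illustrative only; a real playlist also carries #EXT-X-* tags.<br />

```shell
# Simulate an HLS segmenter that keeps the 3 most recent segments.
dir=$(mktemp -d)
max_files=3
for i in 0 1 2 3 4; do
  : > "$dir/segment$i.ts"    # stand-in for encoding one short chunk
  # rewrite the playlist so it lists only the newest segments
  { echo "#EXTM3U"
    ls "$dir"/*.ts | sort | tail -n "$max_files" | sed "s|$dir/||"
  } > "$dir/playlist.m3u8"
  # delete segments that fell out of the playlist window
  for f in "$dir"/*.ts; do
    grep -q "$(basename "$f")" "$dir/playlist.m3u8" || rm "$f"
  done
done
cat "$dir/playlist.m3u8"
```

A client refreshing playlist.m3u8 always sees only the newest chunks, which is why disk usage stays bounded and late joiners start near the live edge.<br />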
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to /dev/shm/hls. nginx will serve on port 80 rather than port 8000, which the python3 server defaults to.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake gstreamer1.0-tools gstreamer1.0-plugins-base \<br />
 gstreamer1.0-plugins-{good,bad,ugly} gstreamer1.0-alsa gstreamer1.0-x \<br />
 v4l-utils alsa-utils libpango1.0-0 libpango1.0-dev x264 \<br />
 liblivemedia-dev liblog4cpp5-dev libasound2-dev vlc libssl-dev iotop \<br />
 autoconf automake libtool v4l2loopback-dkms libvpx-dev libx264-dev \<br />
 libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around ~45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB and the load average is ~0.4-0.5 when idle, and ~0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
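IPv4 multicast groups live in 224.0.0.0/4, i.e. addresses whose first octet is 224-239. This tiny illustrative helper (not part of gstreamer) checks whether an address qualifies before you put it into the udpsink/udpsrc pipelines above:<br />

```shell
# Check whether an address is in the IPv4 multicast range (224.0.0.0/4),
# i.e. usable as a shared destination for many receivers at once.
is_multicast() {
  first=${1%%.*}                       # first octet of the address
  [ "$first" -ge 224 ] && [ "$first" -le 239 ]
}

is_multicast 239.255.12.42 && echo yes   # a typical ad-hoc multicast group
is_multicast 192.168.1.10 || echo no     # unicast: one udpsink per viewer
```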
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the examples above, but in 1280x720 resolution. You will be streaming to the desktop machine over UDP port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment; the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using pinecube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed in a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package. That will need to be done before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
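The mode strings above pack a pixel format code, a resolution, and a frame interval together: UYVY8_2X8/1280x720@1/15 means UYVY at 1280x720 with a frame interval of 1/15 s, i.e. 15 fps. This illustrative shell helper (not a PineCube tool) splits one apart, which is handy when matching motion.conf's width, height and framerate to the mode:<br />

```shell
# Split a media-ctl mode string like "UYVY8_2X8/1280x720@1/15" into
# its parts: pixel code, width, height, and frames per second.
parse_mode() {
  mode="$1"
  pixfmt=${mode%%/*}       # UYVY8_2X8
  rest=${mode#*/}          # 1280x720@1/15
  res=${rest%%@*}          # 1280x720
  width=${res%x*}
  height=${res#*x}
  interval=${rest#*@}      # 1/15 (seconds per frame)
  fps=${interval#*/}       # the interval's denominator is the fps
  echo "$pixfmt ${width}x${height} ${fps}fps"
}

parse_mode 'UYVY8_2X8/1280x720@1/15'   # prints: UYVY8_2X8 1280x720 15fps
```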
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male-to-male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power because the data lines are not connected.<br />
<br />
[[File:Pinecube_webcam1.jpg|400px]] [[File:Pinecube_webcam2.jpg|400px]]<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* Additional patch to the PineCube device tree to disable ehci0 and ohci0, enabling the usb_otg device instead and setting dr_mode to otg<br />
* Instructions for sunxi and the ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
* Configure the g_ether device to start with a stable MAC address in /etc/modprobe.d/g_ether.conf:<br />
 options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (/etc/network/interfaces):<br />
 auto usb0<br />
 iface usb0 inet static<br />
 address 192.168.10.2<br />
 netmask 255.255.255.0<br />
* Boot the PineCube while plugged into a computer<br />
* Configure the USB ethernet device on the computer to be in the same subnet as the PineCube<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at https://github.com/wlhe/uvc-gadget to see if it can bridge the UVC gadget output with the v4l stream from the OV5640 camera sensor<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter Daniel Fullmer's NixOS repo].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera Controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which ones cannot by looking at the flags. The inactive flag indicates that a control is currently disabled. Some controls become inactive when others are turned on. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure flag is inverted, so a value of "0" means on, while "1" means off.''' Maybe the auto_exposure flag will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to manually set the exposure control.<br />
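Since the inactive flag is what determines whether a control can currently be set, it's handy to filter the listing down to just the locked controls. The sketch below runs against a captured excerpt of the listing so it works without the camera; on the device you would pipe the live v4l2-ctl output into the same awk instead:<br />

```shell
# Filter a v4l2-ctl --list-ctrls listing down to the controls that are
# currently inactive (i.e. cannot be set right now). The listing here is
# a captured excerpt; on the PineCube you would use the live output of:
#   v4l2-ctl -d /dev/v4l-subdev* --list-ctrls
listing='red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile'

# Print the name (first field) of every control flagged inactive.
inactive=$(printf '%s\n' "$listing" | awk '/flags=.*inactive/ { print $1 }')
printf '%s\n' "$inactive"   # red_balance and gain, one per line
```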
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the manufacturer; not open source)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan the network for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to log in to the serial console, then check the assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (xz file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], these builds enable the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before running tasks such as <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
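The relevant /etc/motion/motion.conf settings would look something like this (a sketch; width, height and framerate are standard motion options, but check your motion version's documentation for the option that selects the YU12 palette):<br />

```
# /etc/motion/motion.conf (excerpt)
width 640
height 480
framerate 15
```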
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND to keep the microcontroller out of the way.<br />
<br />
After this you can either use the Arduino IDE and its serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, connect to the GPIO header at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28 and 50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
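As a sanity check on these tables, iperf3 reports the transfer in binary MBytes and the bitrate in decimal Mbits/sec, so the bitrate column can be recomputed from the other two columns (to within rounding of the printed transfer figure):<br />

```python
def iperf3_rate_mbits(transfer_mbytes: float, seconds: float) -> float:
    """Recompute iperf3's bitrate column: binary MBytes transferred
    over an interval, expressed as decimal Mbits/sec."""
    return transfer_mbytes * 1024 ** 2 * 8 / seconds / 1e6

# 293 MBytes in 60 seconds matches the 41.0 Mbits/sec reported above
print(round(iperf3_rate_mbits(293, 60.00), 1))  # → 41.0
```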
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92 and 102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
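For reference, the playlist.m3u8 that <code>hlssink</code> keeps rewriting is just a short text file along these lines (segment names follow hlssink's default <code>segment%05d.ts</code> pattern; the sequence numbers are illustrative):<br />

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:1.0,
segment00042.ts
#EXTINF:1.0,
segment00043.ts
#EXTINF:1.0,
segment00044.ts
```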
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
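A minimal index.html along these lines works (a sketch based on the hls.js getting-started example; adjust the script path to your local copy of hls.js):<br />

```
<!DOCTYPE html>
<html>
<body>
  <!-- local copy of https://github.com/video-dev/hls.js -->
  <script src="hls.js"></script>
  <video id="video" controls autoplay muted></video>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Safari and iPhone play HLS natively
      video.src = 'playlist.m3u8';
    }
  </script>
</body>
</html>
```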
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. nginx will serve on port 80, whereas the python3 server defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around ~45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB and the load average is ~0.4-~0.5 when idle, and ~0.51-~0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
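A multicast variant of the pipelines above might look like this (a sketch; 224.1.1.1 is an example multicast group address, and <code>auto-multicast</code> is enabled by default in udpsink/udpsrc):<br />

```shell
# Sender: stream to a multicast group instead of a single host
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! \
    rtpjpegpay name=pay0 ! udpsink host=224.1.1.1 port=8000

# Each receiver joins the same group
gst-launch-1.0 udpsrc address=224.1.1.1 port=8000 ! \
    'application/x-rtp, encoding-name=JPEG,payload=26' ! \
    rtpjpegdepay ! jpegdec ! autovideosink
```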
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
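For reference, each frame that <code>multipartmux</code> emits on the wire is a small MIME-style header followed by the JPEG bytes, roughly as sketched below (the exact header fields GStreamer writes may differ slightly):<br />

```python
def mjpeg_part(jpeg_bytes: bytes, boundary: str = "--duct_tape_boundary") -> bytes:
    """Frame one JPEG as a multipart/x-mixed-replace part, in the
    style of GStreamer's multipartmux element."""
    header = (
        f"\r\n{boundary}\r\n"
        f"Content-Type: image/jpeg\r\n"
        f"Content-Length: {len(jpeg_bytes)}\r\n\r\n"
    ).encode("ascii")
    return header + jpeg_bytes

frame = mjpeg_part(b"\xff\xd8\xff\xd9")  # a (fake) minimal JPEG
assert b"--duct_tape_boundary" in frame
```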
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the above gstreamer, but in 1280x720 resolution. Also, you will be streaming to the desktop machine using UDP, with IP address represented by $desktop below at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error when launching the gstreamer pipeline above is the following message, which seems to occur when [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers isn't set] on the v4l2loopback module (see above), or when a v4l client (vlc, chromium) is already connected to /dev/video10 when the pipeline starts. There does seem to be a small amount of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
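As a quick check before (re)launching the pipeline, a small helper like the following can confirm that the loopback device exists and is not already held open. This is a hypothetical sketch, not part of the original setup; it assumes the <code>fuser</code> tool (from psmisc) is available.<br />

```shell
# Hypothetical helper: verify a v4l2 device node exists and is not already
# held open by another client (vlc, chromium) before starting the pipeline.
check_video_dev() {
  dev="$1"
  if [ ! -e "$dev" ]; then
    echo "missing $dev (did you modprobe v4l2loopback?)"
    return 1
  fi
  # fuser exits 0 when some process has the file open
  if command -v fuser >/dev/null 2>&1 && fuser "$dev" >/dev/null 2>&1; then
    echo "$dev busy (disconnect vlc/chromium first)"
    return 2
  fi
  echo "$dev free"
}
```

For example, run <code>check_video_dev /dev/video10</code> before starting gst-launch-1.0.<br />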
<br />
Now that /dev/video10 is hooked into the gstreamer pipeline, you can connect to it using VLC, which is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are currently [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems].<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect VLC before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in VLC. Note that firefox isn't really working at the moment, with symptoms very similar to the mpv/ffmpeg problem mentioned above, i.e. when it connects to the camera it shows only the first frame and then drops. It's unclear whether the bug is in gstreamer, v4l, ffmpeg, or somewhere in these instructions.<br />
<br />
== Using pinecube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect it from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflections.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package is available for a variety of Linux flavours; there's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion, you can set the following in /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start":<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions, you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package, and this must happen before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
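On images that use systemd, and assuming the motion package ships a unit named motion.service (an untested alternative sketch, not from the original instructions), the same effect can be had with a drop-in override instead of editing the init script:<br />

```shell
# Sketch: run media-ctl before motion starts via a systemd drop-in override,
# assuming the package provides a unit named motion.service.
sudo mkdir -p /etc/systemd/system/motion.service.d
sudo tee /etc/systemd/system/motion.service.d/camera-mode.conf <<'EOF'
[Service]
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'
EOF
sudo systemctl daemon-reload
```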
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
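Since the width, height and framerate in motion.conf must mirror whatever mode string you pass to media-ctl, a small helper can derive those lines from the mode string. This is a hypothetical convenience, not part of the original instructions, and the v4l2_palette still has to be matched by hand:<br />

```shell
# Hypothetical helper: derive the motion.conf width/height/framerate lines
# from a media-ctl mode string such as UYVY8_2X8/1280x720@1/15.
mode_to_motion_conf() {
  mode="$1"
  res="${mode#*/}"     # e.g. 1280x720@1/15
  wh="${res%%@*}"      # e.g. 1280x720
  fps="${res##*/}"     # e.g. 15
  echo "width ${wh%x*}"
  echo "height ${wh#*x}"
  echo "framerate $fps"
}
```

For example, <code>mode_to_motion_conf 'UYVY8_2X8/1280x720@1/15'</code> prints the three matching motion.conf lines.<br />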
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power because the data lines are not connected.<br />
<br />
TBD:<br />
* Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
* An additional patch to the pinecube device tree to disable ehci0 and ohci0, enable the usb_otg device instead, and set dr_mode to otg<br />
* Instructions for sunxi and the ethernet gadget: https://linux-sunxi.org/USB_Gadget/Ethernet<br />
* Add sunxi and g_ether to /etc/modules so they load on startup<br />
* Configure the g_ether device to start with a stable MAC address in /etc/modprobe.d/g_ether.conf:<br />
 options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
* Set a static IP address for usb0 on startup (/etc/network/interfaces):<br />
 auto usb0<br />
 iface usb0 inet static<br />
 address 192.168.10.2<br />
 netmask 255.255.255.0<br />
* Boot the pinecube while it is plugged into a computer<br />
* Configure the USB ethernet device on the computer to be in the same subnet as the pinecube<br />
<br />
* Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam module<br />
* Look at https://github.com/wlhe/uvc-gadget to see if it can bridge UVC gadget output with the v4l2 stream from the OV5640 camera sensor<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> element to <code>videotestsrc</code>, and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LED's to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from here [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
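The steps above can be wrapped into a single night-mode toggle. This is a hypothetical sketch, not from the original instructions: it assumes gpio45 has already been exported as shown above, and the LED polarity may be inverted depending on your kernel. The sysfs root is parameterised only so the logic can be exercised without the hardware; on the PineCube, run it as root with no second argument.<br />

```shell
# Hedged sketch of a night-mode toggle using the sysfs paths from above.
# Assumes gpio45 is already exported. On the PineCube (as root): night_mode on
night_mode() {
  mode="$1"; root="${2:-}"
  case "$mode" in
    on)  led=0; cut=1 ;;  # echo 0 turns the IR LEDs on (may be inverted)
    off) led=1; cut=0 ;;
    *)   echo "usage: night_mode on|off [sysfs-root]" >&2; return 1 ;;
  esac
  echo "$led" > "$root/sys/class/leds/pine64:ir:led1/brightness"
  echo "$led" > "$root/sys/class/leds/pine64:ir:led2/brightness"
  echo "$cut" > "$root/sys/class/gpio/gpio45/value"
}
```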
<br />
=== Camera Controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
The flags show which controls can be changed and which cannot. The inactive flag indicates that a control is currently disabled; some controls become inactive when others are enabled. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1." Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off''' (perhaps this will be changed someday). You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
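For example, setting a manual exposure takes two steps because of the inverted auto_exposure semantics. The wrapper function and its DRY_RUN switch are hypothetical additions for illustration; the v4l2-ctl invocations themselves follow the pattern shown above.<br />

```shell
# Sketch: disable auto exposure (value 1 = OFF, per the inverted semantics
# described above), then set a manual exposure value. With DRY_RUN=1 the
# commands are only printed, so the sequence can be checked without hardware.
set_manual_exposure() {
  exp="$1"; dev="${2:-/dev/v4l-subdev0}"
  run() { if [ "${DRY_RUN:-0}" = "1" ]; then echo "$*"; else "$@"; fi; }
  run v4l2-ctl -d "$dev" --set-ctrl auto_exposure=1
  run v4l2-ctl -d "$dev" --set-ctrl exposure="$exp"
}
```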
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=11138PineCube2021-08-22T02:34:21Z<p>Newton688: /* Low light mode */</p>
<hr />
<div>{{under construction}}<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (original from China, not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| TBD (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], these builds enable the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the pinecube, so you'll want to stop it with <code>systemctl stop motion</code> when doing things like apt update and apt upgrade, and start it again afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND, which holds the microcontroller in reset and keeps it off the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the pinephone and pinebook pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, wire it into the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the pinecube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The following performance results were obtained over the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The following performance results were obtained over the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
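As a concrete starting point, the following sketch writes a minimal index.html based on the hls.js getting-started example. The helper function and element id are arbitrary choices, and it assumes you have already downloaded hls.js into the same directory:<br />

```shell
# Sketch: write a minimal HLS player page into the tmpfs directory.
# Assumes hls.js has been downloaded into the target directory.
write_hls_index() {
  dir="${1:-/dev/shm/hls}"
  cat > "$dir/index.html" <<'EOF'
<!DOCTYPE html>
<html>
<body>
<video id="video" controls autoplay muted></video>
<script src="hls.js"></script>
<script>
var video = document.getElementById('video');
if (Hls.isSupported()) {
  var hls = new Hls();
  hls.loadSource('playlist.m3u8');
  hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = 'playlist.m3u8'; // native HLS support (e.g. iPhone)
}
</script>
</body>
</html>
EOF
}
```

For example, <code>write_hls_index /dev/shm/hls</code> creates the page next to the playlist.m3u8 that hlssink writes.<br />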
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
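<br />
For reference, feeding an nginx-rtmp server from this camera might look like the following untested sketch (the server name "example-server" and stream key "cam" are placeholders; <code>rtmpsink</code> expects FLV, hence the <code>flvmux</code>):<br />
<br />
<pre><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' && \<br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! \<br />
  videoconvert ! video/x-raw,format=I420 ! \<br />
  x264enc speed-preset=ultrafast tune=zerolatency ! h264parse ! \<br />
  flvmux streamable=true ! rtmpsink location='rtmp://example-server/live/cam'<br />
</pre><br />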
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake autoconf automake libtool v4l-utils alsa-utils vlc iotop \<br />
gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x gstreamer1.0-plugins-base \<br />
gstreamer1.0-plugins-{good,bad,ugly} libpango1.0-0 libpango1.0-dev x264 \<br />
liblivemedia-dev liblog4cpp5-dev libasound2-dev libssl-dev v4l2loopback-dkms \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
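<br />
For example, to free the camera first (assuming an image where motion is managed by systemd; both commands are harmless if motion is absent):<br />
<br />
<pre><br />
# run as root<br />
systemctl stop motion 2>/dev/null<br />
pkill -x motion 2>/dev/null<br />
</pre><br />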
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the HLS stream. Total system RAM used is roughly 64MB, and the load average is around 0.4-0.5 when idle and 0.51-0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
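<br />
An untested sketch combining those pieces (a low resolution is chosen because H264 encoding is done in software on the S3):<br />
<br />
<pre><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' && \<br />
gst-rtsp-launch 'v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'<br />
</pre><br />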
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
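<br />
For example, with the administratively-scoped multicast group 239.255.12.42 (an arbitrary address chosen for this untested sketch):<br />
<br />
<pre><br />
# transmitter (runs once, on the PineCube)<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=239.255.12.42 auto-multicast=true port=8000<br />
# each receiver joins the same group<br />
gst-launch-1.0 udpsrc address=239.255.12.42 auto-multicast=true port=8000 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink<br />
</pre><br />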
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the above gstreamer, but in 1280x720 resolution. Also, you will be streaming to the desktop machine using UDP, with IP address represented by $desktop below at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using the PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed in a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions, you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package. That must be done before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
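<br />
On images where motion is managed by systemd rather than an init script, a drop-in override is an alternative way to run media-ctl first (a sketch assuming the service is named motion; create it with <code>systemctl edit motion</code>):<br />
<br />
<pre><br />
[Service]<br />
# set the desired sensor mode before motion starts<br />
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'<br />
</pre><br />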
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette value to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a dual USB-A (male to male) cable to plug it into your computer. Note that the Micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
TBD:<br />
-Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
-Additional patch to pinecube device tree disable ehci0 and ohci0, enabling usb_otg device instead and setting dr_mode to otg<br />
-Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
-Configure the g_ether device to start with a stable MAC address<br />
/etc/modprobe.d/g_ether.conf:<br />
options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
-Set a static IP address for usb0 on startup with network manager (/etc/network/interfaces)<br />
auto usb0<br />
iface usb0 inet static<br />
address 192.168.10.2<br />
netmask 255.255.255.0<br />
-Boot the pinecube plugging it into a computer<br />
-Configure the USB ethernet device on the computer to be in the same subnet as the pinecube<br />
<br />
-Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam<br />
-Look at this project to see if it can bridge UVC gadget output with the v4l output from the OV5640 camera sensor<br />
https://github.com/wlhe/uvc-gadget<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera produces only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter the PineCube NixOS repository].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this may be inverted depending on the version of the kernel you have)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera Controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which cannot by looking at their flags. The inactive flag indicates that a control is currently disabled. Some controls become inactive when others are enabled. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' Maybe the auto_exposure control will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
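<br />
For example, to set the exposure manually (an untested sketch; 1000 is an arbitrary value within the 0-65535 range listed above):<br />
<br />
<pre><br />
# "1" currently means auto exposure OFF (the control is inverted)<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl auto_exposure=1<br />
# the exposure control should now be active and settable<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl exposure=1000<br />
</pre><br />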
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (manufacturer's file, not in an open source format)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (the hostname is pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], these builds enable the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like apt update and apt upgrade, and start it again afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB; it should be directly next to the micro-USB power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First, you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND, which holds the microcontroller in reset and keeps it off the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or use screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wire, connect to the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
These results reflect the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varied between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch, which is in turn connected to a wireless bridge; the wireless network uses WPA2 and the PineCube is connected to this wireless bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
These results reflect the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varied between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware but haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid writing these files to the SD card in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
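<br />
As a concrete sketch of such a page (adapted from the hls.js Getting Started example; <code>write_player_page</code> is an illustrative helper name), the following writes a minimal index.html that expects hls.js in the same directory:<br />

```shell
# write_player_page emits a minimal hls.js player page into the given
# directory. Download hls.js alongside it first, e.g.:
#   wget -O /dev/shm/hls/hls.js https://cdn.jsdelivr.net/npm/hls.js@latest
write_player_page() {
  cat > "$1/index.html" <<'EOF'
<video id="video" controls autoplay muted></video>
<script src="hls.js"></script>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = 'playlist.m3u8';  // native HLS support (e.g. iOS)
  }
</script>
EOF
}
# On the PineCube: write_player_page /dev/shm/hls
```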
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another terminal, run a simple single-threaded webserver which will serve HTML, JavaScript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root of the default site configuration to /dev/shm/hls. nginx will serve on port 80, rather than port 8000 which the python3 server defaults to.<br />
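<br />
A sketch of that change, assuming Debian/Ubuntu nginx packaging (<code>set_nginx_root</code> is an illustrative helper name; check the site file path on your image):<br />

```shell
# set_nginx_root repoints the "root" directive of an nginx site file.
set_nginx_root() {  # usage: set_nginx_root <site-file> <new-root>
  sed -i "s|^\([[:space:]]*\)root .*;|\1root $2;|" "$1"
}
# On the PineCube (as root):
#   set_nginx_root /etc/nginx/sites-available/default /dev/shm/hls
#   nginx -t && systemctl reload nginx
```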
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake gstreamer1.0-tools gstreamer1.0-plugins-base \<br />
 gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \<br />
 gstreamer1.0-alsa gstreamer1.0-x v4l-utils alsa-utils libpango1.0-0 \<br />
 libpango1.0-dev x264 liblivemedia-dev liblog4cpp5-dev libasound2-dev \<br />
 vlc libssl-dev iotop autoconf automake libtool v4l2loopback-dkms \<br />
 libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
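<br />
For example, a short sketch to free the camera before reconfiguring it (harmless to run if motion isn't installed):<br />

```shell
# Stop the motion service, if present, and kill any stray motion
# processes so media-ctl can reconfigure the sensor.
systemctl stop motion 2>/dev/null || true
pkill -x motion 2>/dev/null || true
```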
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compressing the stream to H264 (640x480@7fps) and around 1-2% of the CPU for serving the stream. Total system RAM used is roughly 64MB, and the load average is around 0.4-0.5 when idle and 0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
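<br />
Following that hint, an '''untested''' sketch might look like the following; the pipeline string is kept in a variable so it can be inspected before launching, and it assumes gst-rtsp-launch from the JPEG RTSP section is installed:<br />

```shell
# UNTESTED sketch: software x264 encode (as in the HLS example) served
# over RTSP via gst-rtsp-launch (as in the JPEG RTSP example).
PIPELINE='v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
# On the PineCube:
#   media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
#   gst-rtsp-launch "$PIPELINE"
```

If you verify a working pipeline, please update this section with it.<br />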
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will stream to the desktop machine over UDP port 8000; the desktop's IP address is represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error when launching the gstreamer pipeline above is the following message, which seems to happen when [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers isn't set] on the v4l2loopback module (see above), or when a v4l client (vlc, chromium) is already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline, you can connect to it using VLC, which is a good local test that things are working. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment, and the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the PineCube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html Motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect it from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed on a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. It is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with Motion, you can set the following in /etc/motion/motion.conf and start it with <code>sudo /etc/init.d/motion start</code>:<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions, you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package, and this must happen before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far:<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
<br />
== PineCube as a webcam ==<br />
<br />
The PineCube can be powered by the host and communicate as a peripheral. First, you'll need a USB-A male-to-male cable to plug it into your computer. Note that the Micro-USB port can be used only for power because its data lines are not connected.<br />
<br />
TBD:<br />
-Kernel patches applied from here (perhaps already available in NixOS): https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
-Additional patch to the pinecube device tree to disable ehci0 and ohci0, enable the usb_otg device instead, and set dr_mode to otg<br />
-Add sunxi and g_ether to /etc/modules to get them to load on startup<br />
-Configure the g_ether device to start with a stable MAC address<br />
/etc/modprobe.d/g_ether.conf:<br />
options g_ether host_addr=f6:11:fd:ed:ec:6e<br />
-Set a static IP address for usb0 on startup with network manager (/etc/network/interfaces)<br />
auto usb0<br />
iface usb0 inet static<br />
address 192.168.10.2<br />
netmask 255.255.255.0<br />
-Boot the pinecube plugging it into a computer<br />
-Configure the USB ethernet device on the computer to be in the same subnet as the pinecube<br />
<br />
-Attempt to load the uvc_gadget (usb_f_uvc) or g_webcam<br />
-Look at this project to see if it can bridge UVC gadget output with the v4l feed from the OV5640 camera sensor<br />
https://github.com/wlhe/uvc-gadget<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When it was in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions, you can turn on the infrared LEDs to illuminate the area and enable the IR cut filter using the commands below. These were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter the pinecube-nixos project].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this is inverted and might get fixed someday in the Linux Kernel)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera Controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed and which cannot by looking at the flags. The inactive flag indicates that a control is currently disabled; some controls become inactive when others are turned on. For example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off.''' Maybe the auto_exposure control will get changed someday. You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
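<br />
As a concrete sketch (the helper name and the exact subdevice path are illustrative; use the path that exists on your system):<br />

```shell
# set_manual_exposure disables auto exposure (value 1 = OFF, inverted
# as noted above) and then sets a fixed exposure on the given subdevice.
set_manual_exposure() {  # usage: set_manual_exposure <subdev> <value>
  v4l2-ctl -d "$1" --set-ctrl auto_exposure=1
  v4l2-ctl -d "$1" --set-ctrl exposure="$2"
}
# e.g. on the PineCube: set_manual_exposure /dev/v4l-subdev0 1000
```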
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (original from China; not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan the network for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328 MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First, either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a jumper wire between RESET (RST) and GND to hold the chip in reset and keep it off the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. With a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wire, connect into the GPIO header at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2 Mb/s using 2.462 GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50 Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100 Mb/s using a 1000 Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102 Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu Groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the PineCube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
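As a sketch of that page, the following writes a minimal index.html into the tmpfs directory, based on the hls.js getting-started example (it assumes you have downloaded hls.js into the same directory):<br />

```shell
# Write a minimal HLS player page into tmpfs (based on the hls.js
# getting-started example; assumes hls.js itself sits alongside it).
mkdir -p /dev/shm/hls
cat > /dev/shm/hls/index.html <<'EOF'
<video id="video" controls muted></video>
<script src="hls.js"></script>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');  // playlist written by hlssink
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = 'playlist.m3u8';  // Safari/iOS play HLS natively
  }
</script>
EOF
```
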
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
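A minimal nginx site definition along those lines might look like the following (a sketch, written to the current directory here; installing and enabling it under /etc/nginx follows your distribution's conventions):<br />

```shell
# Sketch of an nginx site serving the HLS files from tmpfs.
cat > pinecube-hls.conf <<'EOF'
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;
    location / {
        # clients refetch playlist.m3u8 constantly, so disable caching
        add_header Cache-Control no-cache;
    }
}
EOF
```
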
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
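As an untested illustration of that rtmpsink route (the RTMP URL is a placeholder for your own nginx-rtmp server; rtmpsink expects an FLV stream, hence the flvmux element):<br />

```shell
# Untested sketch: software h264 encode, FLV mux, RTMP push.
# The RTMP URL is a placeholder for your own nginx-rtmp server.
cat > rtmp-push.sh <<'EOF'
#!/bin/sh
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
gst-launch-1.0 v4l2src \
  ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 \
  ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency \
  ! flvmux streamable=true ! rtmpsink location=rtmp://example.com/live/pinecube
EOF
chmod +x rtmp-push.sh
```
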
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake gstreamer1.0-tools gstreamer1.0-plugins-base \<br />
 gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \<br />
 gstreamer1.0-x gstreamer1.0-alsa v4l-utils alsa-utils libpango1.0-0 \<br />
 libpango1.0-dev x264 liblivemedia-dev liblog4cpp5-dev libasound2-dev \<br />
 vlc libssl-dev iotop autoconf automake libtool v4l2loopback-dkms \<br />
 libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB, and the load average is around 0.4-0.5 when idle and 0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
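In the meantime, here is an untested sketch combining those bits (resolution and x264 settings are illustrative):<br />

```shell
# Untested sketch: raw capture, software h264 encode, served over RTSP
# with the gst-rtsp-launch tool from the JPEG RTSP section above.
cat > h264-rtsp.sh <<'EOF'
#!/bin/sh
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
EOF
chmod +x h264-rtsp.sh
```
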
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
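A sketch of the multicast variant (the group address 239.255.12.42 is only an example):<br />

```shell
# Untested sketch: the same JPEG RTP stream sent to an example multicast
# group so several receivers can join at once.
cat > rtp-multicast.sh <<'EOF'
#!/bin/sh
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 \
  ! rtpjpegpay name=pay0 \
  ! udpsink host=239.255.12.42 port=8000 auto-multicast=true
EOF
chmod +x rtp-multicast.sh
```

Each receiver would then use <code>udpsrc address=239.255.12.42 port=8000 auto-multicast=true</code> in place of the plain <code>udpsrc port=8000</code> element shown earlier.<br />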
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1 Gb Ethernet connection, and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so keep this in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the above gstreamer, but in 1280x720 resolution. Also, you will be streaming to the desktop machine using UDP, with IP address represented by $desktop below at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the MJPEG stream to the Video4Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using the PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the PineCube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect it from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflections.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package is available for a variety of Linux flavours; the package in the standard Ubuntu and Debian repositories works with Armbian. It provides a very simple web interface for live viewing of the camera feed and has motion-trigger capabilities to store still pictures or, in later versions, videos. It is also possible to add hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with Motion, set the following in /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start":<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion and works well with video motion capture (Motion version >= 4.2.2). However, if you want a different mode or resolution, you'll need to set the camera to it with the media-ctl tool from the v4l-utils package before the motion service starts. A simple way to ensure the mode is set before motion starts every time, even across reboots, is a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
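<br />
On images where motion is managed by systemd rather than the init script, a drop-in override is an equivalent way to run media-ctl before the daemon starts. This is an untested sketch; the unit name <code>motion.service</code> and the use of the <code>+</code> privilege prefix are assumptions to verify on your system:<br />
<br />
```ini
# /etc/systemd/system/motion.service.d/camera-mode.conf  (sketch)
[Service]
# The leading + runs this command with full privileges even though
# motion itself runs as an unprivileged user.
ExecStartPre=+/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'
```

After creating the file, run <code>systemctl daemon-reload</code> and restart motion.<br />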
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
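<br />
The mode strings above all follow the same pattern, so a small shell helper (hypothetical, pure string formatting) can reduce quoting mistakes when experimenting with media-ctl:<br />

```shell
# Build the media-ctl format argument used throughout this page.
# This only formats a string, so it can be checked without the camera attached.
ov5640_fmt() {
    # usage: ov5640_fmt PIXFMT WIDTH HEIGHT FPS
    printf '"ov5640 1-003c":0[fmt:%s/%sx%s@1/%s]' "$1" "$2" "$3" "$4"
}

# Example (run as root on the PineCube):
#   media-ctl --set-v4l2 "$(ov5640_fmt UYVY8_2X8 640 480 30)"
```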
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When it is in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To capture imagery in low-light conditions, you can turn on the infrared LEDs to illuminate the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter pinecube-nixos].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this is inverted and might get fixed someday in the Linux Kernel)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
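<br />
The sysfs pokes above can be wrapped in small shell functions. This is a sketch; the paths match the listing above, and <code>SYSFS_ROOT</code> is parameterized only so the logic can be exercised against a fake directory tree:<br />

```shell
# Run as root on the PineCube. SYSFS_ROOT defaults to the real /sys.
SYSFS_ROOT=${SYSFS_ROOT:-/sys}

ir_leds() {  # usage: ir_leds on|off  (brightness is inverted: 0 = on)
    local v=1
    [ "$1" = on ] && v=0
    echo "$v" > "$SYSFS_ROOT/class/leds/pine64:ir:led1/brightness"
    echo "$v" > "$SYSFS_ROOT/class/leds/pine64:ir:led2/brightness"
}

ir_cut() {   # usage: ir_cut 1|0  (enable/disable the IR cut filter on GPIO 45)
    if [ ! -d "$SYSFS_ROOT/class/gpio/gpio45" ]; then
        echo 45 > "$SYSFS_ROOT/class/gpio/export"
    fi
    echo out > "$SYSFS_ROOT/class/gpio/gpio45/direction"
    echo "$1" > "$SYSFS_ROOT/class/gpio/gpio45/value"
}
```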
<br />
=== Camera Controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
You can see which controls can be changed by looking at the flags: the inactive flag indicates that a control is currently disabled. Some controls become inactive when others are enabled; for example, the gain control above is inactive because gain_automatic is enabled with a value of "1". Note that at the current time '''the auto_exposure control is inverted, so a value of "0" means on, while "1" means off''' (maybe this will get changed someday). You'll need to turn off auto_exposure (value=1) if you want to set the exposure control manually.<br />
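<br />
As a concrete example, switching to fully manual exposure might look like the sketch below. The control values are illustrative, and the inverted auto_exposure behaviour described above is assumed; <code>DRY_RUN=1</code> prints the commands instead of executing them:<br />

```shell
manual_exposure() {  # usage: manual_exposure SUBDEV EXPOSURE GAIN
    local run=""
    [ "${DRY_RUN:-0}" = 1 ] && run=echo
    # auto_exposure is inverted on current kernels: 1 means auto OFF
    $run v4l2-ctl -d "$1" --set-ctrl auto_exposure=1
    $run v4l2-ctl -d "$1" --set-ctrl gain_automatic=0
    $run v4l2-ctl -d "$1" --set-ctrl exposure="$2" --set-ctrl gain="$3"
}

# Example (on the PineCube): manual_exposure /dev/v4l-subdev0 2000 64
```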
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine as the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=11086PineCube2021-08-17T17:59:20Z<p>Newton688: /* Low light mode */</p>
<hr />
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (manufacturer file, not an open source format)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like apt update and apt upgrade, and start it again afterwards with <code>systemctl start motion</code>.<br />
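<br />
A small sketch for applying those resolution settings from the shell. Run as root; the configuration file path is a parameter so the edit can be tried on a copy first, and the option names assume Motion 4.x:<br />

```shell
set_motion_mode() {  # usage: set_motion_mode CONF WIDTH HEIGHT FPS
    sed -i \
        -e "s/^width .*/width $2/" \
        -e "s/^height .*/height $3/" \
        -e "s/^framerate .*/framerate $4/" \
        "$1"
}

# Example: set_motion_mode /etc/motion/motion.conf 640 480 15
```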
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube is marked by a small white dot on the PCB and should be directly next to the micro-USB power connector. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD on the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD on the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND, which holds the microcontroller in reset and keeps it off the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. With a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires into the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s on 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge; the wireless network uses WPA2 and the PineCube is connected to this wireless bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
Note that Ethernet does not work in the current Ubuntu Focal or Ubuntu Groovy Armbian images; the results below were gathered with a build where Ethernet is functional.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s on a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
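<br />
A sketch of generating such a player page, loosely following the hls.js getting-started example (downloading hls.js into the same directory beforehand is an assumption about your setup):<br />

```shell
# Write a minimal hls.js player page into the HLS directory.
write_hls_index() {  # usage: write_hls_index DIR
    cat > "$1/index.html" <<'EOF'
<!DOCTYPE html>
<script src="hls.js"></script>
<video id="video" controls autoplay muted></video>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = 'playlist.m3u8';   // native HLS (Safari/iOS)
  }
</script>
EOF
}

# On the PineCube, after downloading hls.js into /dev/shm/hls/:
#   write_hls_index /dev/shm/hls
```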
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another terminal, run a simple single-threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
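<br />
For the nginx variant, a minimal sketch of the default site configuration (the file path and the no-cache header for the constantly rewritten playlist are assumptions to adapt):<br />
<br />
```nginx
# /etc/nginx/sites-available/default  (sketch)
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;

    # The playlist is rewritten constantly; make sure clients re-fetch it.
    location ~ \.m3u8$ {
        add_header Cache-Control no-cache;
    }
}
```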
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses roughly 45-50% of the CPU to compress the stream into H264 (640x480@7fps) and around 1-2% of the CPU to serve the HLS stream. Total system RAM used is roughly 64MB; the load average is about 0.4-0.5 when idle, and 0.51-0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead we used this very small 3rd-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
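One untested sketch (please verify before relying on it; the caps and x264enc settings here are assumptions): configure the sensor for raw UYVY as in the v4l2rtspserver section, software-encode with x264enc, and hand the result to gst-rtsp-launch with an <code>rtph264pay name=pay0</code> payloader:<br />

```shell
# Hypothetical, untested: h264 RTSP via gst-rtsp-launch (software x264 encode).
SENSOR_CMD="media-ctl --set-v4l2 '\"ov5640 1-003c\":0[fmt:UYVY8_2X8/640x480@1/15]'"
PIPELINE="v4l2src ! video/x-raw,width=640,height=480,framerate=15/1 ! videoconvert \
 ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=500 \
 ! rtph264pay name=pay0"
# Printed rather than executed here, since it needs the camera hardware:
echo "$SENSOR_CMD && gst-rtsp-launch '$PIPELINE'"
```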
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
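As an illustrative, untested variant (239.255.12.42 is an arbitrary administratively-scoped multicast address chosen for this example, not a value from this wiki), the two pipelines might become:<br />

```shell
# Hypothetical multicast variant of the JPEG RTP pipelines above.
MCAST=239.255.12.42
TX="gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 \
 ! udpsink host=$MCAST port=8000 auto-multicast=true"
RX="gst-launch-1.0 udpsrc address=$MCAST port=8000 auto-multicast=true \
 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink"
# Printed rather than executed here, since they need the camera and a display:
echo "$TX"; echo "$RX"
```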
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
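To put a rough number on the bandwidth point, MJPEG bandwidth is simply bytes-per-frame times frame rate; using the 73602-byte 1080p frame size quoted in the VLC warning above as an illustrative figure:<br />

```shell
# Back-of-envelope MJPEG bitrate: bytes/frame * frames/s * 8 bits/byte.
frame_bytes=73602   # 1080p JPEG frame size from the VLC warning above
fps=15
echo "$(( frame_bytes * fps * 8 / 1000000 )) Mbit/s"
```

That works out to roughly 8Mbit/s at 15fps, several times what h264 would typically need for comparable quality.<br />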
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software such as Jitsi Meet. This has fairly minimal (<1s) lag when tested over a wired 1Gb Ethernet connection, and the frame rate is passable; keep in mind, though, that MJPEG is very wasteful of network bandwidth. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the examples above, but at 1280x720 resolution. You will stream to the desktop machine over UDP on port 8000; its IP address is represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera pick the "Dummy Video Device..." and it should be much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, ie. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using pinecube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed on a variety of Linux flavours. There is a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with Motion, set the following in /etc/motion/motion.conf and start the service with "sudo /etc/init.d/motion start":<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution work fine with Motion, including video motion capture (Motion version >= 4.2.2). If you want a different mode or resolution, you'll need to set the camera to it with the media-ctl tool from the v4l-utils package, and it must be set before the motion service starts. A simple way to ensure it gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
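On images where motion is managed by systemd rather than the init script (an assumption — check with <code>systemctl status motion</code>), the same effect can be achieved without editing the packaged script by using a drop-in file, for example /etc/systemd/system/motion.service.d/camera-mode.conf:<br />

```
[Service]
ExecStartPre=/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]'
```

Run <code>systemctl daemon-reload</code> afterwards so systemd picks up the drop-in.<br />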
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output shows only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter pinecube-nixos].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead (this is inverted and might get fixed someday in the Linux Kernel)<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
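Those sysfs writes can be wrapped in a tiny helper so the inverted LED logic lives in one place. This is an untested convenience sketch; the values follow the commands above, and SYSFS defaults to /sys on the device (it is a variable only so the function can be exercised off-device):<br />

```shell
# Hypothetical helper (untested on hardware): night_mode on|off drives both IR
# LEDs and the IR cut filter GPIO, matching the commands above. LED brightness
# is inverted on current kernels: 0 = on, 1 = off.
SYSFS="${SYSFS:-/sys}"
night_mode() {
  case "$1" in
    on)  led=0; cut=1 ;;   # LEDs on (inverted), gpio45 = 1 as above
    off) led=1; cut=0 ;;
    *)   echo "usage: night_mode on|off" >&2; return 1 ;;
  esac
  echo "$led" > "$SYSFS/class/leds/pine64:ir:led1/brightness"
  echo "$led" > "$SYSFS/class/leds/pine64:ir:led2/brightness"
  echo "$cut" > "$SYSFS/class/gpio/gpio45/value"
}
```

This assumes gpio45 has already been exported and set to out, as shown above.<br />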
<br />
=== Camera Controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=10966PineCube2021-08-04T13:48:36Z<p>Newton688: /* Armbian */</p>
<hr />
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from China; not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (xz file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Official Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the pinecube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like apt update and apt upgrade, and start it again afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use wires to jump RESET (RST) to GND to isolate the SoC.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the pinephone and pinebook pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, connect into the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the pinecube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462Ghz wireless. Running sixty second iperf3 tests: the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The following results were obtained over the Ethernet network. The link speed was 100Mb/s on a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varied between 92 and 102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
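To make the moving parts concrete, the playlist that <code>hlssink</code> keeps rewriting can be modelled in a few lines of Python (the segment names and tag set here are illustrative; the real file is produced by gstreamer):

```python
def render_playlist(sequence, segments, target_duration=1):
    """Render an HLS media playlist for a sliding window of .ts segments."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        f"#EXT-X-MEDIA-SEQUENCE:{sequence}",
    ]
    for name in segments:
        lines.append(f"#EXTINF:{target_duration:.1f},")
        lines.append(name)
    return "\n".join(lines) + "\n"

# With max-files=3 the window keeps only the newest segments; clients
# use EXT-X-MEDIA-SEQUENCE to notice that older segments have expired.
print(render_playlist(7, ["segment00007.ts", "segment00008.ts", "segment00009.ts"]))
```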
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root of the default configuration to /dev/shm/hls. nginx will serve on port 80, whereas the python3 server defaults to port 8000.<br />
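If you go the nginx route, a server block along these lines should be close (an untested sketch; paths and the configuration layout vary between distributions):

```nginx
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;

    # Serve playlists and segments with the right Content-Type
    types {
        application/vnd.apple.mpegurl m3u8;
        video/mp2t                    ts;
        text/html                     html;
        application/javascript        js;
    }

    # playlist.m3u8 is rewritten every second; don't let clients cache it
    add_header Cache-Control no-cache;
}
```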
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake git autoconf automake libtool gstreamer1.0-tools \<br />
gstreamer1.0-plugins-base gstreamer1.0-plugins-{good,bad,ugly} gstreamer1.0-alsa \<br />
gstreamer1.0-x v4l-utils alsa-utils libpango1.0-0 libpango1.0-dev x264 \<br />
liblivemedia-dev liblog4cpp5-dev libasound2-dev libssl-dev iotop vlc \<br />
v4l2loopback-dkms libvpx-dev libx264-dev libjpeg-dev libx265-dev \<br />
linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compressing the stream to H264 (640x480@7fps) and around 1-2% of the CPU for serving the stream. Total system RAM used is roughly 64MB, and the load average is around 0.4-0.5 when idle and 0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
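In that spirit, here is an untested sketch along those lines (a software x264 encode, so expect heavy CPU use; please verify on hardware and correct the wiki as needed):

```shell
# Untested sketch: software h264 encode at a modest resolution, served via gst-rtsp-launch
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]' && \
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
```

If it works, it should be playable with <code>vlc rtsp://pinecube.local:8554/video</code> like the JPEG variant.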
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
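On the client side, the multipart framing is simple enough to parse by hand. A sketch of the frame-splitting logic, run on canned bytes rather than a live stream (the framing here is simplified relative to what <code>multipartmux</code> actually emits, which also includes a Content-Length header):

```python
def split_mjpeg(stream: bytes, boundary: bytes):
    """Yield the body of each part in a multipart/x-mixed-replace stream."""
    delim = b"--" + boundary
    for chunk in stream.split(delim)[1:]:
        # Each part is headers, a blank line, then the JPEG payload
        head, _, body = chunk.partition(b"\r\n\r\n")
        if body:
            yield body.rstrip(b"\r\n")

# Two fake "frames" framed the way the shell server above frames them
b = b"duct_tape_boundary"
stream = (b"--" + b + b"\r\nContent-Type: image/jpeg\r\n\r\nFRAME1\r\n"
          b"--" + b + b"\r\nContent-Type: image/jpeg\r\n\r\nFRAME2\r\n")
print(list(split_mjpeg(stream, b)))
```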
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software such as Jitsi Meet. This has fairly minimal (<1s) lag when tested over a wired 1Gb Ethernet connection, and the frame rate is passable. Keep in mind that MJPEG is very wasteful of network bandwidth. The following instructions assume Debian Linux (Bullseye) on your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, set up the PineCube with gstreamer much like the JPEG RTP UDP example above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment; the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. on connecting to the camera it shows only the first frame and then drops. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using pinecube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package is available in a variety of Linux flavours; there's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. It is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion and works well with video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package. That will need to be set before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To capture imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these commands were tested on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera Controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
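To drive these controls from a script, the <code>--list-ctrls</code> output is regular enough to parse. A sketch using a few canned lines from the listing above as input (the exact format may differ between v4l-utils versions):

```python
import re

# A few canned lines of `v4l2-ctl --list-ctrls` output (from the listing above)
listing = """
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider
horizontal_flip 0x00980914 (bool) : default=0 value=0
"""

def parse_ctrls(text):
    """Map control name -> dict of its numeric attributes (min, max, value, ...)."""
    ctrls = {}
    for line in text.splitlines():
        m = re.match(r"\s*(\w+)\s+0x[0-9a-f]+\s+\((\w+)\)\s*:\s*(.*)", line)
        if not m:
            continue
        name, _type, attrs = m.groups()
        # Only numeric key=value pairs; textual flags like flags=slider are skipped
        ctrls[name] = {k: int(v) for k, v in re.findall(r"(\w+)=(-?\d+)", attrs)}
    return ctrls

ctrls = parse_ctrls(listing)
print(ctrls["contrast"])
```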
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div><br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin 1 is marked by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the manufacturer; not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| TBD (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
| Audio Device and IR LED Fix<br />
| https://github.com/danielfullmer/pinecube-nixos/blob/master/kernel/Pine64-PineCube-support.patch<br />
| [https://github.com/danielfullmer/pinecube-nixos/issues/2 TBD]<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan the network for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], these builds enable the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled with systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so stop it before running things like <code>apt update</code> and <code>apt upgrade</code> with <code>systemctl stop motion</code>, and restart it afterwards with <code>systemctl start motion</code>.<br />
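The motion.conf edits can also be scripted. A minimal sketch, assuming motion.conf uses plain <code>width</code>/<code>height</code>/<code>framerate</code> keys; it operates on a scratch copy here so it is safe to try anywhere, but on the PineCube you would point <code>CONF</code> at /etc/motion/motion.conf and run it as root:<br />

```shell
# Sketch: apply the 640x480 / 15 fps settings to a motion.conf-style file.
# CONF here is a scratch copy; on the PineCube use CONF=/etc/motion/motion.conf (as root).
CONF=$(mktemp)
printf 'width 320\nheight 240\nframerate 30\n' > "$CONF"
sed -i \
  -e 's/^width .*/width 640/' \
  -e 's/^height .*/height 480/' \
  -e 's/^framerate .*/framerate 15/' "$CONF"
cat "$CONF"
```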
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND so the microcontroller is held out of the way and the USB-serial bridge talks directly to the pins.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires into the GPIO block, connect it at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results below reflect the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
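A note on the units: iperf3 reports the transfer column in binary MBytes but the bitrate column in decimal Mbit/s, so the two can be cross-checked with a little arithmetic. Using the first wireless result above as the example:<br />

```shell
# 293 MBytes (binary) sent over 60 s matches the reported 41.0 Mbit/s (decimal):
awk 'BEGIN { printf "%.1f Mbit/s\n", 293 * 1024 * 1024 * 8 / 60 / 1e6 }'
```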
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results below reflect the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
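For reference, the playlist that <code>hlssink</code> maintains looks roughly like this (the segment names follow hlssink's default <code>segment%05d.ts</code> location pattern; the exact tags vary by GStreamer version):<br />

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:1.0,
segment00042.ts
#EXTINF:1.0,
segment00043.ts
```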
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
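Optionally, confirm that the directory really lives on a RAM-backed filesystem, so the segment churn never touches the SD card:<br />

```shell
# Should report a RAM-backed filesystem type (e.g. "tmpfs") for /dev/shm
df -T /dev/shm | tail -n 1
```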
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
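For nginx, a minimal sketch of such a default site configuration follows; the <code>no-cache</code> header for the playlist is our own addition (clients poll the playlist constantly, so it must never be cached), not something from this page:<br />

```nginx
server {
    listen 80 default_server;
    root /dev/shm/hls;      # the tmpfs directory created above
    index index.html;

    # Clients refresh the playlist constantly; make sure it is never cached.
    location ~ \.m3u8$ {
        add_header Cache-Control no-cache;
    }
}
```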
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x \<br />
 gstreamer1.0-plugins-base gstreamer1.0-plugins-{good,bad,ugly} v4l-utils \<br />
 alsa-utils libasound2-dev libpango1.0-0 libpango1.0-dev x264 libx264-dev \<br />
 libx265-dev libvpx-dev libjpeg-dev liblivemedia-dev liblog4cpp5-dev \<br />
 libssl-dev vlc iotop autoconf automake libtool v4l2loopback-dkms \<br />
 linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compressing the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB, and the load average is ~0.4-0.5 when idle and ~0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response that browsers other than firefox require). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
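For the curious, what a client receives from this server is one HTTP response header followed by a multipart header before every JPEG frame. The sketch below prints that framing; the per-frame header shape follows what <code>multipartmux</code> emits, and the Content-Length shown is a placeholder (it varies per frame):<br />

```shell
# Reproduce the "--duct_tape_boundary" framing from the script above.
b="--duct_tape_boundary"
# The one-time HTTP response header:
printf 'HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=%s\r\n\r\n' "$b"
# What precedes every JPEG frame (Content-Length is a placeholder, it varies per frame).
# Note the delimiter is "--" plus the boundary, so four leading dashes on the wire:
printf -- '--%s\r\nContent-Type: image/jpeg\r\nContent-Length: 12345\r\n\r\n' "$b"
```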
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the above gstreamer, but in 1280x720 resolution. Also, you will be streaming to the desktop machine using UDP, with IP address represented by $desktop below at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using pinecube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed in a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in the /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start"<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution works fine with Motion and works well with video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to those modes with the media-ctl tool that comes with the v4l-utils package. That will need to be set before the motion service starts. A simple method to ensure that it gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
<br />
Note that you must modify /etc/motion/motion.conf so that the v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
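As a guard against mismatches, a tiny hypothetical helper can map the media-ctl format string to the v4l2_palette number. Only the UYVY8 mapping below is taken from this page's motion.conf example; check the Motion documentation before adding further cases:<br />

```shell
# Hypothetical helper: map a media-ctl format string to Motion's v4l2_palette number.
palette_for() {
    case "$1" in
        UYVY8_2X8*) echo 14 ;;                        # UYVY, per the motion.conf example above
        *) echo "unknown format: $1" >&2; return 1 ;;
    esac
}
palette_for 'UYVY8_2X8/1280x720@1/15'
```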
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter danielfullmer's pinecube-nixos repository].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
=== Camera Controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
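After experimenting, it can be handy to restore every control to its default; the defaults can be recovered mechanically from the <code>--list-ctrls</code> output. A sketch (the <code>list_defaults</code> helper is illustrative, not part of v4l-utils):<br />

```shell
# Sketch: turn a v4l2-ctl --list-ctrls listing into name=default pairs
# suitable for v4l2-ctl --set-ctrl. Read-only and volatile controls
# (e.g. pixel_rate, gain) are skipped.
list_defaults() {
    awk '/default=/ && !/read-only/ && !/volatile/ {
        name = $1
        for (i = 1; i <= NF; i++)
            if ($i ~ /^default=/) { sub("default=", "", $i); print name "=" $i }
    }'
}

# Example with two lines of the listing above:
list_defaults <<'EOF'
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile
EOF
# prints: contrast=0
```

Feed it real output with <code>v4l2-ctl -d /dev/v4l-subdev0 --list-ctrls | list_defaults</code> and pass each printed pair back through <code>--set-ctrl</code>.<br />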
<br />
== SDK ==<br />
<br />
=== Stock Linux ===<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
==== How to compile ====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
===== On a dedicated machine =====<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
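Since the build is sensitive to the exact Make and JDK versions installed above, a quick sanity check before launching it can save a failed long build; a sketch (the function names are illustrative, not part of the SDK):<br />

```shell
# Sketch: check the two fragile prerequisites (GNU Make 3.82 and JDK 6)
# before starting an SDK build. The checks parse version strings only.
make_ok() {  # make_ok "<first line of make --version>"
    case "$1" in *"GNU Make 3.82"*) return 0 ;; *) return 1 ;; esac
}
java_ok() {  # java_ok "<first line of java -version output>"
    case "$1" in *'"1.6.'*) return 0 ;; *) return 1 ;; esac
}

check_build_env() {
    make_ok "$(make --version 2>/dev/null | head -n1)" || echo "warning: GNU Make 3.82 required"
    java_ok "$(java -version 2>&1 | head -n1)"         || echo "warning: JDK 6 (1.6.x) required"
}
```

Run <code>check_build_env</code> in the shell you intend to build from; it only prints warnings and changes nothing.<br />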
<br />
===== Using Vagrant =====<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<div><br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (not an open-source format)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to log in to the serial console, then check the assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the usual Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled as root with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480, 15fps, YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so stop it with <code>systemctl stop motion</code> before running things like <code>apt update</code> and <code>apt upgrade</code>, and start it again with <code>systemctl start motion</code> afterwards.<br />
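The motion.conf values mentioned above can be written as a small fragment like the following (a sketch; <code>width</code>, <code>height</code>, and <code>framerate</code> are standard motion options, but verify the palette setting against the comments in your installed motion.conf):<br />

```
# /etc/motion/motion.conf (fragment) -- matches the settings suggested above
width 640
height 480
framerate 15
# v4l2_palette selects the pixel format (YU12 here); the index-to-format
# mapping is listed in the comments of the motion.conf shipped with the package
```

After editing, restart with <code>systemctl restart motion</code> or simply reboot as described above.<br />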
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use wires to jump RESET (RST) to GND to isolate the microcontroller.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or use screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ Pine Store]. With a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires into the GPIO block, wire it at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
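From the figures above, the throughput cost of the WireGuard tunnel can be computed directly; a small sketch (the <code>overhead_pct</code> helper is illustrative; the example numbers are the wired sender bitrates quoted above):<br />

```shell
# Sketch: percentage throughput cost of WireGuard, from the wired sender
# figures above (94.4 Mbit/s plain vs 71.2 Mbit/s tunneled).
overhead_pct() {  # overhead_pct <plain> <tunneled>
    awk -v p="$1" -v t="$2" 'BEGIN { printf "%.1f%%\n", (p - t) / p * 100 }'
}

overhead_pct 94.4 71.2
# prints: 24.6%
```

The same calculation on the wireless sender figures (41.0 vs 32.1) gives a comparable result.<br />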
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to /dev/shm/hls. This will serve on port 80, rather than port 8000 as the python3 server defaults to.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
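As noted above, production HLS deployments typically pair gstreamer's <code>rtmpsink</code> with nginx's RTMP module; a minimal configuration sketch (directive names from the nginx-rtmp-module; the paths and application name are examples, not a tested deployment):<br />

```
# nginx.conf fragment -- requires nginx built with the nginx-rtmp-module
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            hls on;                  # nginx writes the .ts/.m3u8 files itself
            hls_path /dev/shm/hls;   # same tmpfs approach as above
            hls_fragment 1s;
        }
    }
}
```

A gstreamer pipeline should be able to feed this by ending in <code>flvmux ! rtmpsink location=rtmp://localhost/live/stream</code>, since <code>rtmpsink</code> expects FLV-muxed input.<br />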
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB, and the load average is ~0.4-0.5 when idle and ~0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about 2-3s of lag with this approach, likely due to the software H264 compression and the current lack of hardware acceleration.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
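As a starting point for that exercise, here is an untested sketch that combines the software x264 settings from the HLS example with gst-rtsp-launch from the JPEG RTSP example (please verify on hardware and update the wiki with a working pipeline):<br />

```shell
# Untested sketch: software-encoded h264 over RTSP. The pipeline reuses the
# caps and x264enc settings from the HLS example above; run it with
# gst-rtsp-launch on the PineCube after setting the sensor format with
# media-ctl as in that example.
PIPELINE='v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
echo "$PIPELINE"

# On the PineCube:
#   media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
#   gst-rtsp-launch "$PIPELINE"
```

Playback should then work as in the JPEG RTSP example, e.g. <code>vlc rtsp://pinecube.local:8554/video</code>.<br />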
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
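When using a multicast group as <code>$client_ip</code>, it helps to first confirm the address actually falls in the IPv4 multicast range (224.0.0.0-239.255.255.255); a sketch (<code>is_multicast</code> is an illustrative helper, not a gstreamer tool):<br />

```shell
# Sketch: true if a dotted-quad address is an IPv4 multicast group
# (first octet 224-239), i.e. usable for udpsink/udpsrc multicast.
is_multicast() {
    first=${1%%.*}
    case "$first" in
        22[4-9]|23[0-9]) return 0 ;;
        *) return 1 ;;
    esac
}

# Usage: is_multicast 239.255.12.34 && echo "ok to use as a multicast group"
```

As the note above says, check <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for the multicast-related properties on each side.<br />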
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
GStreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the examples above, but at 1280x720 resolution. You will be streaming to the desktop machine, whose IP address is represented by $desktop below, at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
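<br />
If you want /dev/video10 to reappear with the same options after a reboot, the module and its options can be persisted with the standard modules-load.d/modprobe.d mechanism. The file names below are our own choice; the options match the modprobe command above.<br />

```
# /etc/modules-load.d/v4l2loopback.conf -- load the module at boot
v4l2loopback

# /etc/modprobe.d/v4l2loopback.conf -- same options as the modprobe command above
options v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1
```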
<br />
The most common error when launching the gstreamer pipeline above is the message below, which seems to happen when [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers isn't set] on the v4l2loopback module (see above), or when a v4l client (vlc, chromium) is already connected to /dev/video10 when the pipeline starts. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment and the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using pinecube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed in a variety of Linux flavours. There's a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed, and also has motion-trigger capabilities to store still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with motion you can set the following in /etc/motion/motion.conf and start it with <code>sudo /etc/init.d/motion start</code>:<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution work fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes and resolutions you'll need to set the camera to them with the media-ctl tool that comes with the v4l-utils package, and that must happen before the motion service starts. A simple way to ensure that the mode gets set before motion starts every time, even across reboots, is to make a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
<br />
Note that you must modify /etc/motion/motion.conf so that the v4l2_palette, width, height and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
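<br />
As a convenience, the media-ctl mode and the matching motion.conf settings can be changed together with a small helper. This is a sketch of our own (the function name is not part of Motion); it assumes motion.conf already contains width/height/framerate lines, and it only calls media-ctl when a media device is actually present:<br />

```shell
# set_motion_mode WIDTH HEIGHT FPS [CONF] -- our own helper, not part of Motion.
# Rewrites the width/height/framerate lines in a motion.conf and, when the
# camera is present, points the sensor at the same UYVY mode with media-ctl.
set_motion_mode() {
  w=$1; h=$2; fps=$3; conf=${4:-/etc/motion/motion.conf}
  sed -i \
    -e "s/^width .*/width $w/" \
    -e "s/^height .*/height $h/" \
    -e "s/^framerate .*/framerate $fps/" \
    "$conf"
  # Only touch the hardware when it is actually there
  if [ -e /dev/media0 ] && command -v media-ctl >/dev/null 2>&1; then
    media-ctl --set-v4l2 "\"ov5640 1-003c\":0[fmt:UYVY8_2X8/${w}x${h}@1/${fps}]"
  fi
}

# Example: switch to 1280x720 at 15fps, then restart motion
# set_motion_mode 1280 720 15 && sudo /etc/init.d/motion restart
```

Remember that v4l2_palette still has to be matched to the pixel format by hand if you change it.<br />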
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When it was in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from the [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter pinecube-nixos README].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
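<br />
The two steps above can be combined into one small shell function that switches the LEDs and the cut filter together. This is our own sketch (the function name and the SYSFS override are not from the original instructions); the polarity follows the commands above, where writing 0 to the LED brightness turns the IR LEDs on:<br />

```shell
# night_mode on|off -- toggle the IR LEDs and the IR cut filter together (run as root).
# SYSFS can be overridden (e.g. for testing); it defaults to /sys.
night_mode() {
  sys=${SYSFS:-/sys}
  case $1 in
    on)  led=0; cut=1 ;;  # 0 = LEDs on (as in the commands above), 1 = filter enabled
    off) led=1; cut=0 ;;
    *)   echo "usage: night_mode on|off" >&2; return 1 ;;
  esac
  echo "$led" > "$sys/class/leds/pine64:ir:led1/brightness"
  echo "$led" > "$sys/class/leds/pine64:ir:led2/brightness"
  # Export gpio45 the first time only, then drive the IR cut filter
  [ -d "$sys/class/gpio/gpio45" ] || echo 45 > "$sys/class/gpio/export"
  echo out > "$sys/class/gpio/gpio45/direction"
  echo "$cut" > "$sys/class/gpio/gpio45/value"
}
```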
<br />
=== Camera Controls ===<br />
<br />
It is possible to adjust the camera using certain internal camera controls, such as contrast, brightness, saturation and more. These controls can be accessed using the v4l2-ctl tool that is part of the v4l-utils package.<br />
<br />
<pre><br />
# List the current values of the controls<br />
v4l2-ctl -d /dev/v4l-subdev* --list-ctrls<br />
<br />
User Controls<br />
<br />
contrast 0x00980901 (int) : min=0 max=255 step=1 default=0 value=0 flags=slider<br />
saturation 0x00980902 (int) : min=0 max=255 step=1 default=64 value=64 flags=slider<br />
hue 0x00980903 (int) : min=0 max=359 step=1 default=0 value=0 flags=slider<br />
white_balance_automatic 0x0098090c (bool) : default=1 value=1 flags=update<br />
red_balance 0x0098090e (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
blue_balance 0x0098090f (int) : min=0 max=4095 step=1 default=0 value=0 flags=inactive, slider<br />
exposure 0x00980911 (int) : min=0 max=65535 step=1 default=0 value=4 flags=inactive, volatile<br />
gain_automatic 0x00980912 (bool) : default=1 value=1 flags=update<br />
gain 0x00980913 (int) : min=0 max=1023 step=1 default=0 value=20 flags=inactive, volatile<br />
horizontal_flip 0x00980914 (bool) : default=0 value=0<br />
vertical_flip 0x00980915 (bool) : default=0 value=0<br />
power_line_frequency 0x00980918 (menu) : min=0 max=3 default=1 value=1<br />
<br />
Camera Controls<br />
<br />
auto_exposure 0x009a0901 (menu) : min=0 max=1 default=0 value=0 flags=update<br />
<br />
Image Processing Controls<br />
<br />
pixel_rate 0x009f0902 (int64) : min=0 max=2147483647 step=1 default=61430400 value=21001200 flags=read-only<br />
test_pattern 0x009f0903 (menu) : min=0 max=4 default=0 value=0<br />
<br />
# Set the contrast controls to the maximum value<br />
v4l2-ctl -d /dev/v4l-subdev* --set-ctrl contrast=255<br />
</pre><br />
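<br />
If you have experimented with the controls and want to go back to the defaults, the default= values can be scraped out of the --list-ctrls output shown above. The helper below is our own sketch (the function name is made up here); it skips read-only and inactive controls:<br />

```shell
# parse_ctrl_defaults -- print "name=default" for each settable control,
# reading `v4l2-ctl --list-ctrls` output on stdin (our own helper, not part of v4l-utils).
parse_ctrl_defaults() {
  awk '/default=/ && !/read-only|inactive/ {
    for (i = 1; i <= NF; i++)
      if ($i ~ /^default=/) { sub("default=", "", $i); print $1 "=" $i }
  }'
}

# Example: reset every adjustable control on the camera subdevice to its default
# v4l2-ctl -d /dev/v4l-subdev0 --list-ctrls | parse_ctrl_defaults \
#   | xargs -n 1 v4l2-ctl -d /dev/v4l-subdev0 --set-ctrl
```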
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from China; not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
Motion daemon can be enabled using systemctl (With root) <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12. Then just reboot. Note that motion currently takes considerable resources on the pinecube, so you'll want to stop it when doing things like apt upgrade and apt update with <code>systemctl stop motion</code> and then <code>systemctl start motion</code><br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, you can use wires to jump RESET (RST) and GND to isolate the SoC.<br />
<br />
After this you can either use the Arduino IDE and its Serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the pinephone and pinebook pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires into the GPIO block at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the pinecube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462Ghz wireless. Running sixty second iperf3 tests: the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
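<br />
For reference, a minimal index.html along the lines of the hls.js getting-started example might look like the sketch below. It assumes you saved the library next to it as hls.js, and that the playlist is the hlssink default name playlist.m3u8; adjust as needed.<br />

```html
<!-- minimal HLS player page; expects hls.js and playlist.m3u8 in the same directory -->
<!DOCTYPE html>
<html>
<body>
  <video id="video" controls muted autoplay></video>
  <script src="hls.js"></script>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Safari/iPhone can play HLS natively
      video.src = 'playlist.m3u8';
    }
  </script>
</body>
</html>
```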
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
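<br />
A minimal nginx server block for this could look like the sketch below; the file location and exact layout are our assumptions, so merge it into your distribution's default configuration as appropriate.<br />

```nginx
# e.g. /etc/nginx/sites-available/default -- serve the HLS files from tmpfs
server {
    listen 80 default_server;
    root /dev/shm/hls;
    # playlists must not be cached, so clients always see the newest segments
    location ~ \.m3u8$ {
        add_header Cache-Control no-cache;
    }
}
```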
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-tools gstreamer1.0-plugins-base \<br />
gstreamer1.0-plugins-{good,bad,ugly} gstreamer1.0-x gstreamer1.0-alsa \<br />
v4l-utils alsa-utils libpango1.0-0 libpango1.0-dev x264 \<br />
liblivemedia-dev liblog4cpp5-dev libasound2-dev vlc libssl-dev iotop \<br />
autoconf automake libtool v4l2loopback-dkms \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the stream. Total system RAM used is roughly 64MB, and the load average is ~0.4-0.5 when idle and ~0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
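Pending someone testing it, here is an untested sketch along those lines (pipeline details are guesses pieced together from the HLS and JPEG RTSP sections above — please verify and correct). It is written to a file here so it can be inspected before running on the PineCube:<br />

```shell
# UNTESTED sketch of an h264 RTSP server, combining the HLS encoder settings
# with gst-rtsp-launch from the JPEG RTSP section above.
cat > /tmp/h264-rtsp.sh <<'EOF'
#!/bin/sh
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
EOF
chmod +x /tmp/h264-rtsp.sh
```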
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
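An untested multicast variant might look like this (239.255.12.42 is an illustrative address from the administratively scoped 239.0.0.0/8 range, not one used in testing):<br />

```shell
# Untested sketch: JPEG RTP over multicast UDP. The multicast address is
# illustrative; every receiver joins the same group.
cat > /tmp/jpeg-rtp-multicast.sh <<'EOF'
#!/bin/sh
# Sender (on the PineCube):
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! \
  rtpjpegpay name=pay0 ! udpsink host=239.255.12.42 port=8000 auto-multicast=true
EOF
chmod +x /tmp/jpeg-rtp-multicast.sh
# Each receiver would then run:
# gst-launch-1.0 udpsrc address=239.255.12.42 port=8000 auto-multicast=true ! \
#   application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
```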
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (necessary to produce the HTTP response that browsers other than firefox require). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software such as Jitsi Meet. This has fairly minimal (<1s) lag when tested over a wired 1Gb Ethernet connection, and the frame rate is passable. Keep in mind that MJPEG is very wasteful of network bandwidth. The following instructions assume Debian Linux (Bullseye) on your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the pinecube with gstreamer much like the examples above, but at 1280x720 resolution. You will stream to the desktop machine (IP address represented by $desktop below) over UDP to port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline, you can connect to it using VLC as a quick local test that things are working. You could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you saw in vlc. Note that firefox isn't really working at the moment; the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. it shows only the first frame after connecting to the camera and then drops. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using pinecube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect it from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflection.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed on a variety of Linux flavours. There is a package in the standard Ubuntu and Debian repositories, and it works with Armbian. It provides a very simple web interface for live viewing of the camera feed and has motion-trigger capabilities to store still pictures or, in later versions, videos. It is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly with Motion, set the following in /etc/motion/motion.conf and start it with "sudo /etc/init.d/motion start":<br />
<br />
v4l2_palette 14 # UYVY8<br />
width 640<br />
height 480<br />
framerate 15<br />
<br />
This mode and resolution work fine with Motion, including video motion capture (Motion version >= 4.2.2). However, if you want different modes or resolutions, you'll need to set the camera to them with the media-ctl tool from the v4l-utils package before the motion service starts. A simple way to ensure the mode is set before motion starts every time, even across reboots, is a small modification to the /etc/init.d/motion script.<br />
<br />
/usr/bin/media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1280x720@1/15]' # <-- ADD THIS LINE HERE WITH THE DESIRED MODE<br />
log_daemon_msg "Starting $DESC" "$NAME"<br />
if start-stop-daemon --start --oknodo --exec $DAEMON -b --chuid motion ; then<br />
log_end_msg 0<br />
else<br />
log_end_msg 1<br />
RET=1<br />
fi<br />
<br />
Note that you must modify /etc/motion/motion.conf so that v4l2_palette, width, height, and framerate match the mode you set with media-ctl. See the [https://motion-project.github.io/motion_config.html#v4l2_palette Motion documentation] to match the v4l2_palette to the mode. Here is a list of modes that have been tried so far.<br />
<br />
UYVY8_2X8/640x480@1/30<br />
UYVY8_2X8/640x480@1/15<br />
UYVY8_2X8/1280x720@1/15 # This one seems to be fine for live viewing, but causes performance problems when using Motion to capture videos<br />
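Putting the pieces together, the matching motion.conf fragment for the 1280x720 mode above might look like this (v4l2_palette 14 follows the UYVY value used earlier on this page; verify against the palette table for your Motion version). It is written to /tmp here for illustration; the real file is /etc/motion/motion.conf:<br />

```shell
# Example settings matching UYVY8_2X8/1280x720@1/15 set via media-ctl.
# v4l2_palette 14 (UYVY) follows the value used earlier on this page;
# check the Motion docs for your version.
cat > /tmp/motion-720p.conf <<'EOF'
v4l2_palette 14
width 1280
height 720
framerate 15
EOF
```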
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
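For example, a synthetic-source variant of the JPEG RTP sender (a sketch only; jpegenc is needed because videotestsrc emits raw video rather than JPEG, and $client_ip is the receiver's address as in the RTP examples above). Written to a file here so it can be inspected before running:<br />

```shell
# Sketch: swap v4l2src for videotestsrc to produce a test pattern without
# touching the camera, isolating camera/driver problems from pipeline problems.
cat > /tmp/jpeg-rtp-testsrc.sh <<'EOF'
#!/bin/sh
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480 ! jpegenc ! \
  rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000
EOF
chmod +x /tmp/jpeg-rtp-testsrc.sh
```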
<br />
If the camera produces only sensor-noise lines over a black or white image, it may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions, you can turn on the infrared LEDs to light up the dark area and also enable the IR-cut filter using the commands below. Note that these were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter danielfullmer's pinecube-nixos repository].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=10959PineCube2021-08-02T12:59:55Z<p>Newton688: /* Using pinecube as a security camera with motion */</p>
<hr />
<div>{{under construction}}<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (sourced from China; not an open-source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here; it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], they enable the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled with systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the pinecube, so stop it with <code>systemctl stop motion</code> before doing things like apt update and apt upgrade, and restart it afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube is marked by a small white dot on the PCB; it should be directly next to the micro-USB power connector. Attach GND on the woodpecker to pin 6 (GND) on the PineCube, TXD on the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD on the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jumper RESET (RST) to GND to isolate the SoC.<br />
<br />
After this you can use either the Arduino IDE and its serial monitor (after selecting your <code>/dev/ttyACMx</code> Arduino device), or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the pinephone and pinebook pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wire, wire it into the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the pinecube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect use of the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
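The WireGuard configuration used for these tests was not recorded; a generic sketch with placeholder keys and addresses looks like this (generate real keys with <code>wg genkey</code>; all names and addresses below are illustrative):<br />

```shell
# Generic WireGuard config sketch: keys and addresses are placeholders,
# NOT the ones used in the tests above. Written to /tmp for illustration;
# the real file would be /etc/wireguard/wg0.conf.
cat > /tmp/wg0.conf.example <<'EOF'
[Interface]
Address = 10.0.0.2/24
PrivateKey = <pinecube-private-key>
ListenPort = 51820

[Peer]
PublicKey = <test-host-public-key>
AllowedIPs = 10.0.0.1/32
Endpoint = 192.0.2.10:51820
PersistentKeepalive = 25
EOF
```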
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect use of the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the PineCube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding with the x264 library, which is not very fast on this CPU. The SoC in the PineCube has a hardware h264 encoder, which the authors of these examples have not yet tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the SD card in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
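As a sketch, a minimal index.html based on the hls.js getting-started example can be written into the tmpfs directory like this. The CDN URL is an assumption for illustration; substitute a locally served copy of hls.js as recommended above.<br />
<br />
```shell
mkdir -p /dev/shm/hls
# Write a minimal player page; 'EOF' is quoted so nothing is shell-expanded
cat > /dev/shm/hls/index.html <<'EOF'
<!DOCTYPE html>
<html>
<head><title>PineCube HLS</title></head>
<body>
  <video id="video" controls autoplay muted></video>
  <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Safari/iPhone: native HLS support
      video.src = 'playlist.m3u8';
    }
  </script>
</body>
</html>
EOF
```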
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will serve on port 80, rather than port 8000 which the python3 server defaults to.<br />
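A minimal sketch of such an nginx site configuration follows; the file path and MIME type mappings are assumptions, so adjust them for your nginx packaging.<br />
<br />
```nginx
# /etc/nginx/sites-available/default -- illustrative sketch
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;

    location / {
        # clients re-fetch the playlist constantly; don't let them cache it
        add_header Cache-Control no-cache;
        types {
            application/vnd.apple.mpegurl m3u8;
            video/mp2t ts;
        }
    }
}
```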
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake autoconf automake libtool v4l-utils alsa-utils vlc iotop \<br />
gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x gstreamer1.0-plugins-base \<br />
gstreamer1.0-plugins-{good,bad,ugly} libpango1.0-0 libpango1.0-dev \<br />
liblivemedia-dev liblog4cpp5-dev libasound2-dev libssl-dev \<br />
x264 libvpx-dev libx264-dev libx265-dev libjpeg-dev \<br />
v4l2loopback-dkms linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
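A quick way to check for and stop a competing motion instance before reconfiguring the camera (assuming the stock Armbian image, where motion is managed by systemd):<br />
<br />
```shell
# List any running motion processes; prints a note if the camera is free
pgrep -a motion || echo "no motion processes running"
# Stop the service if present (errors suppressed where systemd is absent)
systemctl stop motion 2>/dev/null || true
```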
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around ~45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the HLS stream. Total system RAM used is roughly 64MB and the load average is ~0.4-~0.5 when idle, and ~0.51-~0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
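Until someone verifies a working pipeline, here is an untested sketch combining those bits. It assumes gst-rtsp-launch from the JPEG RTSP section below is installed; please test and correct this on real hardware.<br />
<br />
```shell
# UNTESTED sketch: software-encoded h264 over RTSP via gst-rtsp-launch
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]' &&
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
```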
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. This has fairly minimal (&lt;1s) lag when tested on a wired 1Gb Ethernet connection, and the frame rate is passable. Keep in mind that MJPEG is very wasteful in terms of network resources. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the examples above, but at 1280x720 resolution. You will stream to the desktop machine over UDP port 8000; $desktop below represents the desktop machine's IP address.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using the PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the PineCube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion]. For outdoor use, you'll need an enclosure with a transparent dome to protect it from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid reflections.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
The Motion package can be installed in a variety of Linux flavours; there's a package in the standard Ubuntu and Debian repositories that also works with Armbian. It provides a very simple web interface for live viewing of the camera feed and also has motion-trigger capabilities to store either still pictures or, in later versions, videos. Note that it is also possible to build hooks to automatically process or upload those recordings.<br />
<br />
To get things working quickly, adjust the camera and stream settings in /etc/motion/motion.conf and start the daemon with "sudo /etc/init.d/motion start".<br />
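For reference, the relevant motion.conf settings might look like the excerpt below. These values are a plausible starting point for the PineCube's OV5640 (640x480 at 15fps), not the stock configuration; paths and ports are the motion package defaults and may differ on your install.<br />
<br />
```ini
# /etc/motion/motion.conf -- illustrative excerpt
videodevice /dev/video0
width 640
height 480
framerate 15
# live MJPEG stream on :8081, web control UI on :8080
stream_port 8081
stream_localhost off
webcontrol_port 8080
webcontrol_localhost off
# where snapshots/movies from motion events are stored
target_dir /var/lib/motion
```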
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
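For example, the HLS capture pipeline from above reduces to the following hardware-independent test (a sketch; no media-ctl call is needed since no camera is involved). If the .ts and playlist files appear, the encoding and serving side works and the fault lies in the v4l2 capture path.<br />
<br />
```shell
# Synthetic test pattern in place of the camera
cd /dev/shm/hls/ &&
gst-launch-1.0 videotestsrc ! video/x-raw,width=320,height=240,framerate=15/1 ! clockoverlay ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3
```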
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When it was in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter the PineCube NixOS README].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine as the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering]<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=File:Pinecube_outside_mounted.jpg&diff=10958File:Pinecube outside mounted.jpg2021-08-02T12:53:24Z<p>Newton688: </p>
<hr />
<div></div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=10957PineCube2021-08-02T12:48:11Z<p>Newton688: /* Using pinecube as a security camera with motion */</p>
<hr />
<div>{{under construction}}<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (not an open-source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like apt update and apt upgrade, then restart it afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube is marked by a small white dot on the PCB - it should be directly next to the micro-USB power connector. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use wires to jump RESET (RST) to GND to isolate the SoC.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using the PinePhone/Pinebook Pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. With a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], wire breadboard wires into the GPIO block at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
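If you save a run's output, the headline sender/receiver numbers can be pulled out with a small awk filter. This is a sketch: it assumes iperf3's default (non-JSON) summary format, like the result lines shown below.<br />

```shell
# parse_iperf3: print the sender/receiver bitrates from saved iperf3 output.
# Assumes the default summary format, whose summary lines end with the
# literal word "sender" or "receiver" and carry the bitrate in fields 7-8.
parse_iperf3() {
    awk '/sender$/   { print "sender "   $7 " " $8 }
         /receiver$/ { print "receiver " $7 " " $8 }'
}
```

For example: <code>iperf3 -c pinecube -t 60 | tee run.txt | parse_iperf3</code>.<br />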
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s on a 2.462GHz channel. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system costs roughly 20% of throughput in these tests.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the impact depends on direction: roughly 25% lower throughput in the forward direction, but only a few percent in reverse.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding with the x264 library, which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (the SoC has one in hardware, but we have not figured out how to use it yet, so we are stuck with the slow software encoder).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
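For reference, a playlist.m3u8 written by gstreamer's <code>hlssink</code> with the settings used below looks roughly like this (illustrative contents - segment names follow hlssink's default <code>segment%05d.ts</code> pattern, and the media sequence advances as old segments are deleted):<br />
<pre><br />
#EXTM3U<br />
#EXT-X-VERSION:3<br />
#EXT-X-TARGETDURATION:1<br />
#EXT-X-MEDIA-SEQUENCE:42<br />
#EXTINF:1.000000,<br />
segment00042.ts<br />
#EXTINF:1.000000,<br />
segment00043.ts<br />
</pre><br />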
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
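A minimal player page along the lines of the hls.js "Getting Started" example might look like this (a sketch; it assumes you saved a local copy of the library as <code>hls.js</code> next to index.html):<br />
<pre><br />
<video id="video" controls muted autoplay></video><br />
<script src="hls.js"></script><br />
<script><br />
  var video = document.getElementById('video');<br />
  if (Hls.isSupported()) {            // most desktop browsers, Android<br />
    var hls = new Hls();<br />
    hls.loadSource('playlist.m3u8');<br />
    hls.attachMedia(video);<br />
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {<br />
    video.src = 'playlist.m3u8';      // Safari/iOS play HLS natively<br />
  }<br />
</script><br />
</pre><br />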
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root in the default configuration to /dev/shm/hls. nginx will serve on port 80 rather than port 8000, the default of the python3 server.<br />
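A sketch of such an nginx server block (the MIME type entries are an assumption based on common HLS setups, not taken from the Armbian image):<br />
<pre><br />
server {<br />
    listen 80 default_server;<br />
    root /dev/shm/hls;<br />
    # clients need correct MIME types for the playlist and segments<br />
    types {<br />
        text/html html;<br />
        application/javascript js;<br />
        application/vnd.apple.mpegurl m3u8;<br />
        video/mp2t ts;<br />
    }<br />
}<br />
</pre><br />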
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
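For illustration, a minimal nginx-rtmp configuration might look like the following (a sketch based on the nginx-rtmp-module documentation; the application name <code>live</code> and the stream key are arbitrary):<br />
<pre><br />
rtmp {<br />
    server {<br />
        listen 1935;<br />
        application live {<br />
            live on;<br />
            # a gstreamer pipeline can publish here, e.g.:<br />
            #   ... ! x264enc ! flvmux ! rtmpsink location=rtmp://host/live/streamkey<br />
        }<br />
    }<br />
}<br />
</pre><br />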
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x \<br />
gstreamer1.0-plugins-{base,good,bad,ugly} v4l-utils alsa-utils libpango1.0-0 \<br />
libpango1.0-dev x264 liblivemedia-dev liblog4cpp5-dev libasound2-dev vlc \<br />
libssl-dev iotop autoconf automake libtool v4l2loopback-dkms libvpx-dev \<br />
libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the HLS stream. Total system RAM used is roughly 64MB, and the load average is ~0.4-0.5 when idle and ~0.51-0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers that precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response that browsers other than Firefox require to play the stream). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software, such as Jitsi Meet. This has fairly minimal (<1s) lag when tested on a wired 1Gb Ethernet connection, and the frame rate is passable. Keep in mind that MJPEG is very wasteful in terms of network resources. The following instructions assume Debian Linux (Bullseye) on your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will stream to the desktop machine over UDP, to the IP address represented by $desktop below, at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and the result should be much like what you see in vlc. Note that firefox isn't really working at the moment, with symptoms very similar to the mpv/ffmpeg problem mentioned above, i.e. on connecting to the camera it shows only the first frame and then drops. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using pinecube as a security camera with motion ==<br />
<br />
It's possible to use the PineCube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside use, you'll need an enclosure with a transparent dome to protect the device from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid glare.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]] [[File:Pinecube_outside_mounted.jpg|400px]]<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When it was in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions, you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter the pinecube-nixos repository].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
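The steps above can be wrapped into a small shell function. This is a sketch: the sysfs paths and the active-low LED behaviour are taken from the commands above, gpio45 must already be exported as shown, and the <code>SYSFS_ROOT</code> override is purely an illustration so the logic can be tried without the real hardware.<br />

```shell
#!/bin/sh
# night_mode on|off - toggle the IR LEDs and the IR cut filter together.
# SYSFS_ROOT defaults to /sys; override it to dry-run against a temp dir.
SYSFS_ROOT="${SYSFS_ROOT:-/sys}"

night_mode() {
    case "$1" in
        on)  ir_led=0; ir_cut=1 ;;   # LEDs switch on with 0 (see the notes above)
        off) ir_led=1; ir_cut=0 ;;
        *)   echo "usage: night_mode on|off" >&2; return 1 ;;
    esac
    echo "$ir_led" > "$SYSFS_ROOT/class/leds/pine64:ir:led1/brightness"
    echo "$ir_led" > "$SYSFS_ROOT/class/leds/pine64:ir:led2/brightness"
    echo "$ir_cut" > "$SYSFS_ROOT/class/gpio/gpio45/value"
}
```

On the PineCube itself, run as root: <code>night_mode on</code>.<br />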
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file].<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (not an open-source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release; it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan the network for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328 MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian builds for the PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], they enable the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled as root with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480, 15 fps, YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so stop it with <code>systemctl stop motion</code> before running <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
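The resolution and frame-rate settings can be applied to motion.conf with sed. This is a sketch shown against a local copy of the file; the option names are assumed from motion's default configuration, so verify them against your installed version, and on the PineCube edit /etc/motion/motion.conf itself:<br />

```shell
# Work on a local copy for illustration; use /etc/motion/motion.conf on the device.
conf=motion.conf
printf 'width 320\nheight 240\nframerate 10\n' > "$conf"   # stand-in for the stock file
sed -i -e 's/^width .*/width 640/' \
       -e 's/^height .*/height 480/' \
       -e 's/^framerate .*/framerate 15/' "$conf"
grep -E '^(width|height|framerate)' "$conf"
```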
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First, either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a jumper wire between RESET (RST) and GND to hold the microcontroller in reset and keep it off the serial lines.<br />
<br />
After this, you can either use the Arduino IDE's Serial Monitor (after selecting your <code>/dev/ttyACMx</code> Arduino device) or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wire, connect into the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462 GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
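As a sanity check, the bitrate column follows from the transfer column, assuming iperf3's usual convention of binary MBytes (2^20 bytes) and decimal Mbits/sec. For the first sender line above:<br />

```shell
# 293 MBytes transferred in 60 seconds, converted to Mbits/sec
awk 'BEGIN { printf "%.1f Mbits/sec\n", 293 * 1048576 * 8 / 60 / 1000000 }'
# prints: 41.0 Mbits/sec
```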
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
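A playlist.m3u8 written by hlssink looks roughly like the following (an illustrative sketch; segment names, sequence numbers and durations will vary):<br />

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:42
#EXT-X-TARGETDURATION:1
#EXTINF:1,
segment00042.ts
#EXTINF:1,
segment00043.ts
#EXTINF:1,
segment00044.ts
```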
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
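A minimal nginx server block for this could look like the following. This is a sketch assuming a stock Debian/Ubuntu nginx layout; adjust to your distribution, and note that clients then connect to port 80 rather than 8000:<br />

```nginx
server {
    listen 80 default_server;
    root /dev/shm/hls;
    # the playlist is rewritten every second, so keep clients from caching it
    location ~ \.m3u8$ {
        add_header Cache-Control no-cache;
    }
}
```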
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
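For reference, the nginx-rtmp setup mentioned above is configured with an <code>rtmp</code> block of roughly this shape (an illustrative sketch, not a tested PineCube configuration):<br />

```nginx
rtmp {
    server {
        listen 1935;
        # a gstreamer pipeline can publish here via the rtmpsink element
        application live {
            live on;
        }
    }
}
```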
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
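For example (assuming motion runs under systemd as a service named <code>motion</code>):<br />

```shell
# Free the camera before reconfiguring it:
# stop the motion service, ignoring errors if it is not installed
systemctl stop motion 2>/dev/null || true
# then kill any remaining processes by exact name
pkill -x motion 2>/dev/null || true
```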
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compressing the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB, and the load average is ~0.4-0.5 when idle and ~0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
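For reference, the response this little server produces looks roughly like the following on the wire (JPEG bodies abbreviated and the Content-Length value is a placeholder; the exact part headers depend on the multipartmux version):<br />

```
HTTP/1.1 200 OK
Content-type: multipart/x-mixed-replace;boundary=--duct_tape_boundary

----duct_tape_boundary
Content-Type: image/jpeg
Content-Length: 123456

<JPEG data for one frame>
----duct_tape_boundary
Content-Type: image/jpeg
...
```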
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP, to the IP address represented by $desktop below, at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using pinecube as a security camera with motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid glare.<br />
<br />
[[File:Pinecube_outside_enclosure.jpg|400px]]<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LED's to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from here [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine as the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=10954PineCube2021-08-02T12:45:59Z<p>Newton688: /* Using pinecube as a security camera with motion */</p>
<hr />
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the manufacturer; not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here; this only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so stop it with <code>systemctl stop motion</code> before running tasks such as <code>apt update</code> or <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
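For reference, the corresponding lines in /etc/motion/motion.conf would look roughly like this (a sketch using standard motion option names; the palette/format option and its value depend on your motion version, so it is omitted here):<br />

```
# capture device and geometry, matching the settings described above
videodevice /dev/video0
width 640
height 480
framerate 15
```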
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND so that the onboard microcontroller is held in reset.<br />
<br />
After this you can either use the Arduino IDE and its serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires into the GPIO header at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462Ghz wireless. Running sixty second iperf3 tests: the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
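The throughput figures above can be sanity-checked with shell arithmetic; for example, the first wireless result (293 MBytes transferred in 60 seconds, where iperf3 counts a MByte as 2^20 bytes) works out to roughly 41 Mbit/s:<br />

```shell
# Convert an iperf3 transfer total into an average bitrate.
# 293 MBytes in 60 seconds, 2^20 bytes per MByte:
bytes=$((293 * 1024 * 1024))
# tenths of a megabit per second, using integer arithmetic only
mbit_tenths=$((bytes * 8 * 10 / 60 / 1000000))
echo "$((mbit_tenths / 10)).$((mbit_tenths % 10)) Mbit/s"   # prints 40.9 Mbit/s
```

iperf3's own 41.0 figure differs slightly because it computes the rate from exact byte counts and timing rather than the rounded transfer total.<br />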
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the SD card in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
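For illustration, the playlist.m3u8 maintained by the <code>hlssink</code> element used below looks roughly like this (segment names and sequence numbers here are invented for the example):<br />

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:1.0,
segment00042.ts
#EXTINF:1.0,
segment00043.ts
#EXTINF:1.0,
segment00044.ts
```

Clients re-fetch this file, notice the media sequence advancing, and download the newly listed .ts chunks.<br />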
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
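If you would rather not adapt the upstream example, a minimal player page can be sketched as follows (this assumes you saved the hls.js build next to it as <code>hls.js</code> and that the pipeline below is writing <code>playlist.m3u8</code> in the same directory; it uses the documented hls.js calls <code>Hls.isSupported()</code>, <code>loadSource()</code> and <code>attachMedia()</code>):<br />

```
<!DOCTYPE html>
<video id="video" controls autoplay muted></video>
<script src="hls.js"></script>
<script>
  var video = document.getElementById('video');
  if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari/iOS play HLS natively
    video.src = 'playlist.m3u8';
  } else if (Hls.isSupported()) {
    // everywhere else, hls.js fetches the playlist and feeds Media Source Extensions
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  }
</script>
```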
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root of the default configuration to /dev/shm/hls. This will serve on port 80, rather than on port 8000 as the python3 server does by default.<br />
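A minimal sketch of such an nginx configuration (a hypothetical server block; file locations vary by distribution) might be:<br />

```
server {
    listen 80 default_server;
    root /dev/shm/hls;

    # some HLS players are picky about these MIME types
    types {
        text/html                     html;
        application/vnd.apple.mpegurl m3u8;
        video/mp2t                    ts;
    }
}
```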
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake autoconf automake libtool v4l-utils alsa-utils vlc iotop \<br />
 gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x gstreamer1.0-plugins-base \<br />
 gstreamer1.0-plugins-{good,bad,ugly} libpango1.0-0 libpango1.0-dev \<br />
 liblivemedia-dev liblog4cpp5-dev libasound2-dev libssl-dev x264 \<br />
 v4l2loopback-dkms libvpx-dev libx264-dev libjpeg-dev libx265-dev \<br />
 linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB, and the load average is ~0.4-0.5 when idle and ~0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead we used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
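For reference, what the browser receives over the socket looks roughly like this (frame sizes vary per JPEG, and the exact per-part headers depend on the multipartmux version, so treat it as a sketch):<br />

```
HTTP/1.1 200 OK
Content-type: multipart/x-mixed-replace;boundary=--duct_tape_boundary

--duct_tape_boundary
Content-Type: image/jpeg
Content-Length: 51234

<one JPEG frame's bytes>
--duct_tape_boundary
Content-Type: image/jpeg
...
```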
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the JPEG RTP UDP example above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at the moment, and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Using pinecube as a security camera with motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid glare.<br />
<br />
[[File:IMG_0026.jpg|400px]]<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions, you can turn on the infrared LEDs to light up the dark area and enable the IR cut filter using the commands below. Note that these were performed on Armbian, following the instructions in [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter danielfullmer's pinecube-nixos README].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
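The steps above can be wrapped in a small helper function (a sketch: it assumes GPIO 45 has already been exported as shown, uses the inverted LED polarity noted in the comments, and parameterises the sysfs root so the logic can be tried without hardware; the LED names may differ between kernel versions):<br />

```shell
# night: switch the IR LEDs and IR cut filter together (run as root on the PineCube).
# SYSFS defaults to /sys but can be overridden to exercise the logic elsewhere.
SYSFS="${SYSFS:-/sys}"
night() {
    led1="$SYSFS/class/leds/pine64:ir:led1/brightness"
    led2="$SYSFS/class/leds/pine64:ir:led2/brightness"
    gpio="$SYSFS/class/gpio/gpio45/value"
    case "$1" in
        on)  echo 0 > "$led1"; echo 0 > "$led2"; echo 1 > "$gpio" ;;  # LEDs are active-low
        off) echo 1 > "$led1"; echo 1 > "$led2"; echo 0 > "$gpio" ;;
    esac
}
```

Usage: <code>night on</code> for night mode, <code>night off</code> to return to daylight operation.<br />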
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires a free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=10953PineCube2021-08-02T12:45:35Z<p>Newton688: /* Using pinecube as a security camera with motion */</p>
<hr />
<div>{{under construction}}<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26-pin GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they cannot all be used at the same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (proprietary format, not open source)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release and appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to log in to the serial console, then check for the assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled as root with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480, 15 fps, YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
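The video settings above correspond to entries like these in /etc/motion/motion.conf (a sketch; the option names are from the Motion documentation, and the palette number should be checked against the comments in your own motion.conf):<br />

```
videodevice /dev/video0
width 640
height 480
framerate 15
# 17 selects V4L2_PIX_FMT_YUV420 (YU12) in Motion 4.x
v4l2_palette 17
```
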
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND to isolate the SoC.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or use screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wire, connect it to the GPIO block at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varied between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
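A minimal index.html along those lines might look like this (a sketch based on the hls.js Getting Started snippet, assuming hls.js has been downloaded into /dev/shm/hls/ alongside it):<br />

```
<!-- Plays playlist.m3u8 from the same directory, via hls.js where needed -->
<video id="video" controls muted autoplay></video>
<script src="hls.js"></script>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {            // most desktop and Android browsers
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = 'playlist.m3u8';      // native HLS playback (Safari/iPhone)
  }
</script>
```
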
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
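For nginx, the change amounts to a server block like this (a sketch against the default Debian/Ubuntu nginx layout):<br />

```
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;
}
```
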
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake autoconf automake libtool v4l-utils alsa-utils vlc iotop \<br />
 gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x gstreamer1.0-plugins-base \<br />
 gstreamer1.0-plugins-{good,bad,ugly} x264 libx264-dev libx265-dev libvpx-dev \<br />
 libjpeg-dev libpango1.0-0 libpango1.0-dev libasound2-dev liblog4cpp5-dev \<br />
 liblivemedia-dev libssl-dev v4l2loopback-dkms linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB and the load average is ~0.4-0.5 when idle, and ~0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
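Following that hint, an untested sketch (combining the camera setup and software x264 settings from the HLS example with gst-rtsp-launch from the JPEG RTSP example) might look like:<br />

```
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]' &&
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
```

If this works for you, please update the wiki with the confirmed pipeline.<br />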
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
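For example, with a multicast group (an untested sketch; the group address 224.1.1.1 is arbitrary, and <code>address</code>/<code>auto-multicast</code> are standard udpsrc/udpsink properties):<br />

```
# Sender (on the PineCube): stream to the multicast group
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=224.1.1.1 port=8000 auto-multicast=true

# Each receiver joins the same group
gst-launch-1.0 udpsrc address=224.1.1.1 port=8000 auto-multicast=true ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
```
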
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP to port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that Firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the pinecube as an inside or outside security camera using [https://motion-project.github.io/index.html motion]. For outside, you'll need an enclosure with a transparent dome to protect from the weather. One suggestion is to mount the camera with the lens as close as possible to the dome to avoid glare.<br />
<br />
[[File:IMG_026.jpg|400px]]<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When it was in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these steps were performed on Armbian, following [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter these instructions].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
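The sequence above can be wrapped in a small helper. The following is a sketch only: the sysfs paths, the inverted LED polarity (writing 0 turns the LEDs on) and GPIO 45 are taken from the commands above, and the /sys base path is parameterized so the logic can be exercised off-device:<br />

```python
import os

# sysfs paths for the two IR LEDs, per the commands above
LED_PATHS = ["class/leds/pine64:ir:led1/brightness",
             "class/leds/pine64:ir:led2/brightness"]

def set_night_mode(enable, sysfs="/sys"):
    # IR LEDs: writing 0 turns them ON (polarity is inverted on this board)
    for rel in LED_PATHS:
        with open(os.path.join(sysfs, rel), "w") as f:
            f.write("0" if enable else "1")
    gpio = os.path.join(sysfs, "class/gpio/gpio45")
    if not os.path.isdir(gpio):
        # Export GPIO 45 the first time (the kernel creates the gpio45 dir)
        with open(os.path.join(sysfs, "class/gpio/export"), "w") as f:
            f.write("45")
    with open(os.path.join(gpio, "direction"), "w") as f:
        f.write("out")
    # 1 engages the IR cut filter, 0 releases it
    with open(os.path.join(gpio, "value"), "w") as f:
        f.write("1" if enable else "0")
```
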
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine as the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=10952PineCube2021-08-02T12:36:11Z<p>Newton688: </p>
<hr />
<div>{{under construction}}<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't all be used at the same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased from [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the manufacturer in China; not an open source format)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328 MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
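For reference, here is a minimal excerpt of /etc/motion/motion.conf matching the settings above. This is a sketch only: option names follow motion 4.x, 8081 is the usual default stream port, and the correct palette option number for YU12 varies between motion releases, so check the comments in the shipped config file:<br />
<pre><br />
# Resolution and rate as recommended above<br />
width 640<br />
height 480<br />
framerate 15<br />
# Serve the live stream to other hosts, not just localhost<br />
stream_port 8081<br />
stream_localhost off<br />
</pre><br />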
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND to isolate the SoC.<br />
<br />
After this you can either use the Arduino IDE and its serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires into the GPIO header at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
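For scripted runs, iperf3 can emit machine-readable results with its -J flag. Here is a minimal sketch for pulling the sender/receiver bitrates out of that output; the end.sum_sent/end.sum_received field names are assumptions based on iperf3 3.x JSON, so verify against your version's output:<br />

```python
import json

def throughput_mbps(report):
    """Return (sender, receiver) throughput in Mbit/s from `iperf3 -c ... -J` output."""
    end = json.loads(report)["end"]
    return (end["sum_sent"]["bits_per_second"] / 1e6,
            end["sum_received"]["bits_per_second"] / 1e6)
```
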
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462Ghz wireless. Running sixty second iperf3 tests: the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
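As a sanity check, the Transfer and Bitrate columns above are self-consistent once you note that iperf3's MBytes are mebibytes; for the first run:<br />

```python
# 41.0 Mbit/s sustained for 60 s, converted to MiB as iperf3 reports it
bits = 41.0e6 * 60
mib = bits / 8 / 2**20
assert round(mib) == 293  # matches the 293 MBytes in the first table above
```
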
<br />
==== Wired network performance ====<br />
<br />
Note that the Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image; the results below were collected on a system where wired networking was functional.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to /dev/shm/hls. This will serve on port 80 rather than port 8000, which the python3 server uses by default.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
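The playlist.m3u8 that hlssink maintains is plain text and easy to inspect. Here is a minimal sketch that pulls the .ts segment URIs out of a playlist body; per the M3U8 format, non-blank lines that do not start with '#' are segment URIs:<br />

```python
def segment_uris(playlist):
    """Return the media segment URIs from an M3U8 playlist body."""
    return [line.strip() for line in playlist.splitlines()
            if line.strip() and not line.startswith("#")]
```
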
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake gstreamer1.0-tools gstreamer1.0-plugins-base \<br />
 gstreamer1.0-plugins-{good,bad,ugly} gstreamer1.0-alsa gstreamer1.0-x \<br />
 v4l-utils alsa-utils libpango1.0-0 libpango1.0-dev x264 \<br />
 liblivemedia-dev liblog4cpp5-dev libasound2-dev vlc libssl-dev iotop \<br />
 autoconf automake libtool v4l2loopback-dkms \<br />
 libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around ~45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB and the load average is ~0.4-~0.5 when idle, and ~0.51-~0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
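An untested sketch of such a pipeline, combining the UYVY capture and <code>x264enc</code> settings from the HLS example with <code>gst-rtsp-launch</code> from the JPEG RTSP example; please verify and update the wiki:<br />
<pre><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]' && \<br />
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'<br />
</pre><br />
As with the JPEG variant, the stream should then be playable with <code>vlc rtsp://pinecube.local:8554/video</code>.<br />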
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
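A sketched multicast variant of the above, untested; 239.255.42.42 is an arbitrary administratively-scoped multicast address chosen for illustration:<br />
<pre><br />
# Transmit (on the PineCube)<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=239.255.42.42 port=8000<br />
# Receive (on each viewer)<br />
gst-launch-1.0 udpsrc address=239.255.42.42 port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink<br />
</pre><br />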
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers that precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response that browsers other than Firefox require for playback). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software such as Jitsi Meet. This has fairly minimal (<1s) lag when tested on a wired 1Gb Ethernet connection, and the frame rate is passable; keep in mind, however, that MJPEG is very wasteful of network bandwidth. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, as was done [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube much like the gstreamer examples above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
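Before starting a browser it can help to confirm the loopback device is working (v4l2-ctl comes from the v4l-utils package; the test-pattern pipeline exercises /dev/video10 without involving the PineCube at all):<br />

```shell
# Show the loopback device's capabilities and current format
v4l2-ctl --device=/dev/video10 --all

# Optional: feed a synthetic test pattern into the virtual camera instead of
# the network stream, to rule out camera/network problems
gst-launch-1.0 videotestsrc ! video/x-raw,format=I420,width=1280,height=720 \
  ! v4l2sink device=/dev/video10
```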
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at this moment, and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== PineCube as a security camera with Motion ==<br />
<br />
It's possible to use the PineCube as an indoor or outdoor security camera using [https://motion-project.github.io/index.html motion].<br />
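As a starting point, here is a minimal <code>/etc/motion/motion.conf</code> sketch consistent with the Armbian notes elsewhere on this page (640x480 at 15 fps). Option names follow Motion 4.x and may differ in older releases:<br />

```text
# /etc/motion/motion.conf -- minimal sketch (Motion 4.x option names)
videodevice /dev/video0
width 640
height 480
framerate 15
# MJPEG preview stream, reachable from other hosts on port 8081
stream_port 8081
stream_localhost off
# Where snapshots/movies triggered by motion events are written
target_dir /var/lib/motion
# Number of changed pixels that counts as motion
threshold 1500
```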
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
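Note that videotestsrc produces raw video, not JPEG, so for the JPEG pipelines above you also need a jpegenc element in place of the camera's on-chip JPEG output. For example, a camera-free version of the JPEG RTP UDP sender (with <code>$client_ip</code> as before):<br />

```shell
# Synthetic test image in place of the camera; jpegenc stands in for the
# sensor's hardware JPEG mode
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480 ! jpegenc \
  ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000
```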
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to illuminate the scene, and enable the IR-cut filter, using the commands below. Note that these were performed on Armbian following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine as the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{under construction}}<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (proprietary file from China, not open source)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to log in to the serial console, then check for the assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], they enable the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled (as root) with <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot. Note that motion currently takes considerable resources on the PineCube, so you'll want to stop it with <code>systemctl stop motion</code> before doing things like <code>apt update</code> and <code>apt upgrade</code>, and start it again afterwards with <code>systemctl start motion</code>.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND, which holds the microcontroller in reset and keeps it off the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wire, connect to the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to /dev/shm/hls. This will run on port 80, rather than the python3 server which defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
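A hedged sketch of sending to such a server (rtmp://example.com/live/pinecube is a placeholder; you need an nginx-rtmp or similar endpoint listening there, and this still uses the slow software encoder):<br />

```shell
# Untested sketch: raw capture, software h264 encode, pushed to an RTMP server
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]' && \
gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 \
  ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency \
  ! h264parse ! flvmux streamable=true \
  ! rtmpsink location='rtmp://example.com/live/pinecube live=1'
```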
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
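One way to do that check-and-kill step (as root; the <code>|| true</code> keeps the commands from failing when motion isn't running):<br />

```shell
# Stop and disable the stock motion service, then kill any stragglers
systemctl stop motion 2>/dev/null || true
pkill -x motion || true
```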
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB and the load average is ~0.4-0.5 when idle, and ~0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
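Until someone confirms a working pipeline on the wiki, here is an untested sketch combining those bits (same software x264 caveats as the HLS example):<br />

```shell
# Untested sketch: h264 RTSP via gst-rtsp-launch (see the JPEG RTSP section for build notes)
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/15]'
gst-rtsp-launch 'v4l2src ! video/x-raw,width=640,height=480,format=UYVY,framerate=15/1 ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
```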
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
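For instance, a hedged multicast variant of the above (239.255.12.42 is an arbitrary address from the local-scope multicast range; untested on the PineCube):<br />

```shell
# Sender (on the PineCube): one stream, many receivers
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 \
  ! udpsink host=239.255.12.42 port=8000 auto-multicast=true

# Receiver (on each viewer):
gst-launch-1.0 udpsrc address=239.255.12.42 port=8000 auto-multicast=true \
  ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
```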
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than Firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
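To see why the doubled-dash boundary trick works: MIME part delimiters are the declared boundary prefixed with <code>--</code>, and multipartmux appears to do the same with its boundary property, so with <code>--duct_tape_boundary</code> declared in both places the delimiters come out consistently as <code>----duct_tape_boundary</code>. A camera-free sketch of the bytes a browser receives (the JPEG payload is faked here):<br />

```shell
# Fake one frame of the MJPEG response to inspect the framing (no camera needed).
b="--duct_tape_boundary"
fake_response() {
  printf 'HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=%s\r\n\r\n' "$b"
  # multipartmux prefixes its boundary property with "--" before every frame
  printf '%s\r\nContent-Type: image/jpeg\r\n\r\n(jpeg bytes)\r\n' "--$b"
}
fake_response
```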
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. This has fairly minimal (&lt;1s) lag when tested on a wired 1Gb ethernet network connection, and the frame rate is passable. Keep in mind that MJPEG is very wasteful in terms of network resources. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will be streaming over UDP to the desktop machine, whose IP address is represented by $desktop below, at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error when launching the gstreamer pipeline above is the following, which seems to happen when [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers isn't set] on the v4l2loopback module (see above), or if a v4l client (vlc, chromium) is already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
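When the stack wedges in the not-negotiated state described above, reloading the loopback module is a quick reset (close any clients of /dev/video10 first):<br />

```shell
# Tear down and recreate the virtual device between attempts
sudo modprobe -r v4l2loopback
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1
```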
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, you can change <code>v4l2src</code> to <code>videotestsrc</code>; the gstreamer pipeline will then produce a synthetic test image without using the camera hardware.<br />
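For example, the HLS capture pipeline with the camera swapped out for a synthetic source (no media-ctl step is needed since no hardware is touched):<br />

```shell
cd /dev/shm/hls/ &&
gst-launch-1.0 videotestsrc ! video/x-raw,width=320,height=240,framerate=15/1 \
  ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom \
  ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux \
  ! hlssink target-duration=1 playlist-length=2 max-files=3
```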
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter here].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either setup a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering]<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=10941PineCube2021-07-31T02:30:06Z<p>Newton688: /* Armbian */</p>
<hr />
<div>{{under construction}}<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't all be used at the same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (from the manufacturer in China; not an open source file)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Set the video settings in /etc/motion/motion.conf to 640x480 15fps YU12, then reboot.<br />
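The motion.conf edits can be scripted; this hypothetical <code>tune_motion</code> helper assumes the stock single-file config and the standard <code>width</code>/<code>height</code>/<code>framerate</code> option names (the pixel format is selected elsewhere, e.g. via motion's <code>v4l2_palette</code> option, per its documentation):<br />

```shell
# Hypothetical helper: rewrite the capture size and rate in a motion.conf.
# Takes the config path as an argument so it can be tried on a copy first.
tune_motion() {
  sed -i -e 's/^width .*/width 640/' \
         -e 's/^height .*/height 480/' \
         -e 's/^framerate .*/framerate 15/' "$1"
}

# On the PineCube (as root):
#   tune_motion /etc/motion/motion.conf && systemctl enable motion && reboot
```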
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use wires to jump RESET (RST) and GND so the microcontroller is held in reset and cannot drive the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires into the GPIO header, connect it in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
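As a sanity check on these tables, iperf3's bitrate column is just the transferred MBytes (mebibytes) converted to megabits and divided by the interval. Recomputing the wired sender line (iperf3 uses exact byte counts internally, so the last digit can occasionally differ):<br />

```shell
# 675 MBytes transferred in 60.00 seconds (wired, no WireGuard)
awk 'BEGIN { printf "%.1f Mbits/sec\n", 675 * 1024 * 1024 * 8 / 60 / 1000000 }'
# prints "94.4 Mbits/sec"
```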
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
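For orientation, a playlist.m3u8 maintained by <code>hlssink</code> looks roughly like this (illustrative; segment names follow hlssink's default <code>segment%05d.ts</code> pattern, and the media sequence number advances as old segments are deleted):<br />

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:42
#EXT-X-TARGETDURATION:1
#EXTINF:1.0,
segment00042.ts
#EXTINF:1.0,
segment00043.ts
#EXTINF:1.0,
segment00044.ts
```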
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
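A minimal index.html in that style (adapted from the hls.js getting-started example; assumes you have downloaded hls.js into the same directory) can be dropped in place like this:<br />

```shell
mkdir -p /dev/shm/hls
cat > /dev/shm/hls/index.html <<'EOF'
<!-- minimal hls.js player; fetch hls.js into this directory first -->
<video id="video" controls autoplay muted></video>
<script src="hls.js"></script>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {            // most desktop browsers
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = 'playlist.m3u8';      // native HLS (e.g. iPhone Safari)
  }
</script>
EOF
```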
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to /dev/shm/hls. nginx will serve on port 80, whereas the python3 server defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
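As an untested sketch of that rtmpsink route (the server URL and stream name are placeholders, and an nginx-rtmp server must already be listening there):<br />

```shell
# Same software h264 encode as above, but muxed into FLV and pushed over RTMP.
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 \
  ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency \
  ! flvmux streamable=true ! rtmpsink location='rtmp://example.com/live/mystream'
```

RTMP carries FLV, hence the flvmux element in place of mpegtsmux.<br />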
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake gstreamer1.0-tools gstreamer1.0-plugins-base \<br />
 gstreamer1.0-plugins-{good,bad,ugly} gstreamer1.0-alsa gstreamer1.0-x \<br />
 v4l-utils alsa-utils libpango1.0-0 libpango1.0-dev x264 liblivemedia-dev \<br />
 liblog4cpp5-dev libasound2-dev vlc libssl-dev iotop autoconf automake libtool \<br />
 v4l2loopback-dkms libvpx-dev libx264-dev libjpeg-dev libx265-dev \<br />
 linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB and the load average is ~0.4-0.5 when idle, and ~0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> unless you increase its buffer size, e.g. <code>vlc --rtsp-frame-buffer-size=300000 rtsp://pinecube.local:8554/video</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
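An untested sketch along the lines of that hint (please verify, correct, and then promote it out of sketch status):<br />

```shell
# Configure the sensor as in the HLS example, then serve software-encoded h264 over RTSP.
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]'
gst-rtsp-launch 'v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
```

If it works, it should be playable with <code>vlc rtsp://pinecube.local:8554/video</code> like the JPEG variant.<br />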
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
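For example, an untested multicast variant (224.1.1.1 is an arbitrary multicast group address; pick one appropriate for your network):<br />

```shell
# Sender: one pipeline feeds any number of receivers in the multicast group
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 \
  ! udpsink host=224.1.1.1 port=8000 auto-multicast=true

# Each receiver joins the same group and plays the stream
gst-launch-1.0 udpsrc address=224.1.1.1 port=8000 auto-multicast=true \
  ! 'application/x-rtp, encoding-name=JPEG,payload=26' ! rtpjpegdepay ! jpegdec ! autovideosink
```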
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
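For reference, what goes over the wire is a single HTTP response header followed by an endless multipart body, one part per JPEG frame. Here is a sketch of just the framing, with a placeholder instead of real JPEG bytes (the exact headers multipartmux emits may differ slightly):<br />

```shell
# Sketch of the MJPEG-over-HTTP framing produced by mjpeg-response.sh above.
b="--duct_tape_boundary"

# One-time HTTP response header announcing a never-ending multipart body:
hdr=$(printf 'HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=%s\r\n' "$b")
printf '%s\n' "$hdr"

# Each frame is then framed by "--" plus the declared boundary:
part=$(printf -- '--%s\r\nContent-Type: image/jpeg\r\n' "$b")
printf '%s\n' "$part"
echo '(JPEG bytes of one frame here, then the next boundary, forever)'
```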
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the pinecube as a virtual camera video device (Video 4 Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (<1s) lag when tested on a wired 1Gb ethernet network connection and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, set up the PineCube with gstreamer much like the examples above, but at 1280x720 resolution. You will stream to the desktop machine over UDP (its IP address is represented by $desktop below) on UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should be much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
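For example, a sketch of the JPEG RTP sender from above with the camera swapped out (no media-ctl call is needed, and since videotestsrc outputs raw video a jpegenc element is added):<br />

```shell
gst-launch-1.0 videotestsrc ! video/x-raw,width=320,height=240 ! jpegenc \
  ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000
```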
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LED's to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian using the instructions from here [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either setup a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering]<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=9879PineCube2021-04-17T01:04:38Z<p>Newton688: </p>
<hr />
<div>{{under construction}}<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE (''4-18V!'')<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchased at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file (Chinese source, not open source)]<br />
** [https://drive.google.com/file/d/1MDNxnPL2kuYGC4Y4qf9J6YPYZF15KnN7/view?usp=sharing Quick and dirty STL conversion (beta) by doodlebro. Prints and fits at 0.25mm layer height.]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The only Armbian release with support for Ethernet and the camera module is the Ubuntu Groovy release. The Ubuntu Groovy release is experimental and automatically generated, but it appears to support more of the hardware than the other Armbian releases.<br />
<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], they enable the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled using systemctl (as root): <code>systemctl enable motion</code>. Then reboot.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First, either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to connect RESET (RST) to GND so that the microcontroller is held in reset and stays off the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires to the GPIO header at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462Ghz wireless. Running sixty second iperf3 tests: the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid writing these files to the SD card in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
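For illustration, a rolling playlist.m3u8 as written by <code>hlssink</code> looks roughly like this (sequence numbers and segment names are illustrative; with <code>playlist-length=2</code>, only the two newest segments are listed):<br />
<pre><br />
#EXTM3U<br />
#EXT-X-VERSION:3<br />
#EXT-X-MEDIA-SEQUENCE:42<br />
#EXT-X-TARGETDURATION:1<br />
#EXTINF:1.000000,<br />
segment00042.ts<br />
#EXTINF:1.000000,<br />
segment00043.ts<br />
</pre><br />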
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
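As a sketch of such a page (assuming you saved hls.js next to it, and that the playlist is named playlist.m3u8, which is hlssink's default name), the following script writes a minimal index.html into a directory of your choice; the <code>Hls.*</code> calls follow the hls.js getting-started example:<br />

```shell
#!/bin/sh
# Write a minimal HLS player page (based on the hls.js getting-started
# example) into the given directory; "hls.js" is assumed to be saved there.
HLS_DIR=${1:-/tmp/hls-demo}   # on the PineCube, pass /dev/shm/hls/
mkdir -p "$HLS_DIR"
cat > "$HLS_DIR/index.html" <<'EOF'
<!DOCTYPE html>
<html>
<body>
<video id="video" controls autoplay muted></video>
<script src="hls.js"></script>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = 'playlist.m3u8';  // native HLS support (e.g. iOS Safari)
  }
</script>
</body>
</html>
EOF
echo "wrote $HLS_DIR/index.html"
```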
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another terminal, run a simple single-threaded webserver which will serve the HTML, JavaScript, and HLS files to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root of the default configuration to /dev/shm/hls. This will serve on port 80, rather than port 8000 which the python3 server defaults to.<br />
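A minimal nginx server block for this might look as follows (a sketch, not a tested production configuration; the no-cache header keeps clients from caching the constantly-rewritten playlist):<br />
<pre><br />
server {<br />
    listen 80;<br />
    root /dev/shm/hls;<br />
    location ~ \.m3u8$ {<br />
        add_header Cache-Control no-cache;<br />
    }<br />
}<br />
</pre><br />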
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x \<br />
gstreamer1.0-plugins-base gstreamer1.0-plugins-{good,bad,ugly} v4l-utils \<br />
alsa-utils libpango1.0-0 libpango1.0-dev x264 liblivemedia-dev \<br />
liblog4cpp5-dev libasound2-dev vlc libssl-dev iotop autoconf automake \<br />
libtool v4l2loopback-dkms libvpx-dev libx264-dev libjpeg-dev libx265-dev \<br />
linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
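A small sketch of that check (the <code>motion</code> process name is the one used by the stock image; adjust if yours differs):<br />

```shell
#!/bin/sh
# Check whether a motion process is holding the camera before running media-ctl.
if pgrep -x motion >/dev/null 2>&1; then
  RESULT="motion is running; stop it first (systemctl stop motion)"
else
  RESULT="no motion process found; the camera should be free"
fi
echo "$RESULT"
```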
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB and the load average is ~0.4-0.5 when idle, and ~0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
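For example, a multicast variant of the pair above could look like the commands printed by this snippet; the group address 239.255.12.42 is an arbitrary choice from the administratively-scoped range, and the script only prints the pipelines (a dry run) so you can review them before running them on the actual devices:<br />

```shell
#!/bin/sh
# Dry run: print multicast variants of the transmit/receive pipelines above.
# 239.255.12.42 is an arbitrary administratively-scoped multicast group.
GROUP=239.255.12.42
PORT=8000
SENDER="gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$GROUP auto-multicast=true port=$PORT"
RECEIVER="gst-launch-1.0 udpsrc address=$GROUP auto-multicast=true port=$PORT ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink"
echo "sender:   $SENDER"
echo "receiver: $RECEIVER"
```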
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software, such as Jitsi Meet. This has fairly minimal (<1s) lag when tested over a wired 1Gb/s Ethernet connection, and the frame rate is passable. Keep in mind that MJPEG is very wasteful in terms of network bandwidth. The following instructions assume Debian Linux (Bullseye) on your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, as was done [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will stream to the desktop machine over UDP, with its IP address represented by $desktop below, at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error when launching the gstreamer pipeline above is the following message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if a v4l client (vlc, chromium) is already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
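If you hit this, one recovery sequence with a reasonable chance of working (an assumption based on the max_buffers issue above, not part of the original guide) is to disconnect any clients and reload the module:<br />

```shell
#!/bin/sh
# Possible recovery for the "not-negotiated" error: free the loopback device,
# then reload the module with the same options as above.
DEV=/dev/video10
if [ -e "$DEV" ]; then
  sudo fuser -k "$DEV" 2>/dev/null    # kick any client still holding the device
  sudo modprobe -r v4l2loopback
  sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1
  RESULT="reloaded v4l2loopback on $DEV"
else
  RESULT="$DEV not present; nothing to reload"
fi
echo "$RESULT"
```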
<br />
Now that /dev/video10 is hooked into the gstreamer pipeline, you can connect to it using VLC. VLC makes a good local test that things are working; you can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect VLC before trying to use the virtual web camera with Chromium. Launch Chromium and go to a web conference such as [https://meet.jit.si jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and the result should look much like what you see in VLC. Note that Firefox isn't really working at the moment, and the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
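For instance, a camera-free variant of the HLS pipeline from earlier could look like the line printed by this snippet (a dry run; the overlay/encoder elements match the earlier example, with v4l2src, decodebin and videoconvert dropped since videotestsrc already emits raw I420 video):<br />

```shell
#!/bin/sh
# Build (but do not run) a test pipeline that replaces the camera with a
# synthetic source; run the printed command on the PineCube to try it.
PIPELINE='videotestsrc ! video/x-raw,width=320,height=240,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3'
echo "gst-launch-1.0 $PIPELINE"
```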
<br />
If the camera output shows only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== Camera Adjustments ==<br />
<br />
=== Low light mode ===<br />
<br />
To get imagery in low-light conditions you can turn on the infrared LEDs to light up the dark area and also enable the IR cut filter using the commands below. Note that these were performed on Armbian, following the instructions from [https://github.com/danielfullmer/pinecube-nixos#enablingdisabling-ir-cut-filter the pinecube-nixos README].<br />
<br />
<pre><br />
# Run these as root<br />
<br />
# Turn on the IR LED lights (note that you can see a faint red glow from them when it's low light)<br />
# Turn them off with echo 1 instead<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led1/brightness<br />
# echo 0 > /sys/class/leds/pine64\:ir\:led2/brightness<br />
<br />
# Export gpio, set direction<br />
# echo 45 > /sys/class/gpio/export<br />
# echo out > /sys/class/gpio/gpio45/direction<br />
<br />
# Enable IR cut filter (note that you can hear the switching noise)<br />
# Disable with echo 0 instead<br />
# echo 1 > /sys/class/gpio/gpio45/value<br />
</pre><br />
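The steps above can be wrapped in a small helper script (a sketch; the sysfs paths and the inverted LED polarity, where 0 means on, are taken directly from the commands above):<br />

```shell
#!/bin/sh
# Night-mode helper based on the sysfs commands above (run as root on the
# PineCube). Note the inverted LED polarity documented above: 0 = on, 1 = off.
# Usage: night-mode.sh on|off
MODE=${1:-on}
LED1=/sys/class/leds/pine64:ir:led1/brightness
LED2=/sys/class/leds/pine64:ir:led2/brightness
GPIO=/sys/class/gpio
if [ ! -e "$LED1" ]; then
  RESULT="IR LED sysfs entries not found; is this a PineCube?"
else
  [ -d "$GPIO/gpio45" ] || echo 45 > "$GPIO/export"
  echo out > "$GPIO/gpio45/direction"
  if [ "$MODE" = on ]; then
    echo 0 > "$LED1"; echo 0 > "$LED2"   # LEDs on (active low)
    echo 1 > "$GPIO/gpio45/value"        # enable the IR cut filter
  else
    echo 1 > "$LED1"; echo 1 > "$LED2"   # LEDs off
    echo 0 > "$GPIO/gpio45/value"        # disable the IR cut filter
  fi
  RESULT="night mode: $MODE"
fi
echo "$RESULT"
```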
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=9525PineCube2021-03-20T15:18:14Z<p>Newton688: /* virtual web camera: gstreamer, mjpeg, udp rtp unicast */</p>
<hr />
<div>{{note|1=PAGE UNDER CONSTRUCTION, INFO SUBJECT TO CHANGE}}<br />
<br />
<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Output<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchase at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PineCube case 3D file]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The only Armbian release with support for Ethernet and the camera module is the Ubuntu Groovy release. The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled as root using systemctl: <code>systemctl enable motion</code>. Then just reboot.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND to keep the microcontroller out of the circuit.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires into the GPIO header at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The following results were obtained over the wireless network. The link speed was 72.2 Mb/s on the 2.462 GHz band. Over sixty-second iperf3 tests, the observed throughput varied between 28 and 50 Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is in turn connected to a WPA2 wireless bridge; the PineCube is connected to this wireless network.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system costs roughly 20% of the throughput in the client-to-server direction, but only a few percent in reverse.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The following results were obtained over the Ethernet network. The link speed was 100 Mb/s on a 1000 Mb/s prosumer switch. Over sixty-second iperf3 tests, the observed throughput varied between 92 and 102 Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system costs roughly a quarter of the throughput in the client-to-server direction, but only a few percent in reverse.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
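For reference, the relative cost of WireGuard on the wired link can be computed directly from the iperf3 results above:<br />

```shell
# WireGuard throughput cost on the wired link, from the iperf3 numbers
# above (values in tenths of Mb/s to keep the arithmetic integer-only).
plain_fwd=944; wg_fwd=712   # 94.4 vs 71.2 Mb/s, client-to-server
plain_rev=941; wg_rev=898   # 94.1 vs 89.8 Mb/s, reverse (-R)
echo "forward overhead: $(( (plain_fwd - wg_fwd) * 100 / plain_fwd ))%"
echo "reverse overhead: $(( (plain_rev - wg_rev) * 100 / plain_rev ))%"
```

This prints roughly 24% forward and 4% reverse overhead.<br />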
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which exists in hardware on the S3, but which we haven't figured out how to use yet, so we're stuck with the slow software one).<br />
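To see where the latency floor comes from, multiply the segment length by the playlist depth. The numbers below match the hlssink settings used in this section; real latency adds encode and network delay on top:<br />

```shell
# Minimum HLS startup latency: players normally buffer the whole playlist
# before starting. With hlssink's target-duration=1 and playlist-length=2:
target_duration=1   # seconds per .ts segment
playlist_length=2   # segments kept in playlist.m3u8
echo "latency floor: $(( target_duration * playlist_length )) s + encode/network delay"
```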
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
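For reference, a playlist.m3u8 written by the pipeline below (with target-duration=1 and playlist-length=2) looks roughly like the following; the segment names follow hlssink's default <code>segment%05d.ts</code> pattern, and the sequence number is illustrative:<br />

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:42
#EXT-X-TARGETDURATION:1
#EXTINF:1.0,
segment00042.ts
#EXTINF:1.0,
segment00043.ts
```

Clients re-fetch this file continuously and download whichever .ts segments are newly listed.<br />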
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
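A minimal index.html along those lines might look like this (adapted from the hls.js Getting Started example; it assumes you saved hls.js next to index.html and that the playlist is served as playlist.m3u8):<br />

```
<!-- minimal HLS playback page; assumes ./hls.js and ./playlist.m3u8 -->
<video id="video" controls muted autoplay></video>
<script src="hls.js"></script>
<script>
  var video = document.getElementById('video');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('playlist.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari / iPhone can play HLS natively
    video.src = 'playlist.m3u8';
  }
</script>
```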
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another terminal, run a simple single-threaded webserver which will serve HTML, JavaScript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses roughly 45-50% of the CPU for compressing the stream to H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB, and the load average is ~0.4-0.5 when idle, rising to ~0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
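A quick way to check whether an address you plan to use falls in the IPv4 multicast range (224.0.0.0-239.255.255.255) is a small helper like this hypothetical function (not part of any of the tools above):<br />

```shell
# IPv4 multicast addresses have a first octet of 224-239.
is_multicast() {
  first_octet="${1%%.*}"
  [ "$first_octet" -ge 224 ] && [ "$first_octet" -le 239 ]
}
is_multicast 239.255.12.42 && echo "239.255.12.42: multicast"
is_multicast 192.168.1.10 || echo "192.168.1.10: unicast"
```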
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Lag is fairly minimal (<1s) when tested over a wired 1 Gb Ethernet connection and the frame rate is passable, but keep in mind that MJPEG is very wasteful in terms of network resources. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
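To put "wasteful" in numbers, here is a rough bitrate estimate for the MJPEG stream; the per-frame size is an assumed average, since real JPEG frames vary a lot with scene content:<br />

```shell
# Back-of-envelope MJPEG bandwidth: frames per second times average
# compressed frame size. ~100 KB per 1280x720 JPEG frame is an assumption.
fps=30
avg_frame_kb=100
echo "approx. bitrate: $(( fps * avg_frame_kb * 8 / 1000 )) Mb/s"
```

An h264 stream of comparable quality typically needs only a few Mb/s.<br />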
<br />
First, you will need to set up the PineCube with gstreamer much like the examples above, but at 1280x720 resolution. You will be streaming to the desktop machine over UDP port 8000; the desktop's IP address is represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when the [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers aren't set] on the v4l2loopback module (see above), or if there is a v4l client (vlc, chromium) already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect vlc before trying to use the virtual web camera with chromium. Launch chromium and go to a web conference like [https://meet.jit.si jitsi]. When it prompts you for the camera pick the "Dummy Video Device..." and it should be much like what you see in vlc. Note that firefox isn't really working at this moment and the symptoms appear very similar to the problem with mpv/ffmpeg mentioned above, ie. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output consists only of sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{note|1=PAGE UNDER CONSTRUCTION, INFO SUBJECT TO CHANGE}}<br />
<br />
<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Ouptut<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchase at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PinePhone 3D file]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tdb (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The only Armbian release with support for Ethernet and the camera module is the Ubuntu Groovy release. The Ubuntu Groovy release is an experimental, automatically generated release and it appears to support additional hardware from the other Armbian releases.<br />
<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB micoSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MGB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled as root with <code>systemctl enable motion</code>, then reboot.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jumper RESET (RST) to GND, which holds the microcontroller in reset and isolates it from the serial lines.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available from the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, connect it to the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the pinecube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s on a 2.462GHz wireless channel. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
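The Bitrate column can be reproduced from the Transfer column, assuming iperf3's usual units: Transfer is reported in MiB (2<sup>20</sup> bytes) while Bitrate uses decimal megabits. A quick sketch of the conversion:<br />

```python
def mbits_per_sec(mbytes: float, seconds: float) -> float:
    """Convert an iperf3 Transfer figure (MiB) over an interval to Mbits/sec."""
    bits = mbytes * 1024 * 1024 * 8   # iperf3 "MBytes" are mebibytes
    return bits / 1e6 / seconds       # iperf3 "Mbits" are decimal megabits

print(round(mbits_per_sec(293, 60.00), 1))  # 41.0, matching the sender line
print(round(mbits_per_sec(291, 60.01), 1))  # 40.7, matching the receiver line
```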
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s on a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
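The ~94 Mb/s figures are roughly what a saturated 100 Mb/s link should deliver. A back-of-envelope sketch, assuming a 1500-byte MTU and TCP timestamps enabled (12 bytes of TCP options):<br />

```python
# Theoretical TCP goodput on 100 Mb/s Ethernet (sketch; assumes 1500-byte MTU
# and TCP timestamps). Per frame: 40 bytes of IP+TCP headers plus 12 bytes of
# options leave 1448 payload bytes, while the wire also carries the Ethernet
# header, FCS, preamble and inter-frame gap.
MTU = 1500
TCP_PAYLOAD = MTU - 20 - 20 - 12          # IP header, TCP header, TCP options
WIRE_BYTES = MTU + 14 + 4 + 8 + 12        # Ethernet header, FCS, preamble, gap

goodput = 100 * TCP_PAYLOAD / WIRE_BYTES  # Mbits/sec of useful payload
print(round(goodput, 1))  # ~94.1, in line with the iperf3 results
```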
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project] for easy-to-use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the SD card in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
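To make the moving parts concrete, here is a sketch of what a live playlist.m3u8 roughly looks like; the segment names and durations are illustrative, and the exact tags <code>hlssink</code> writes may differ slightly:<br />

```python
def hls_playlist(segments, target_duration=1, media_sequence=0):
    """Build a minimal live HLS playlist for the given (name, duration) pairs."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        f"#EXT-X-MEDIA-SEQUENCE:{media_sequence}",
    ]
    for name, duration in segments:
        lines.append(f"#EXTINF:{duration:.3f},")  # duration of the next segment
        lines.append(name)                        # the .ts chunk to fetch
    return "\n".join(lines) + "\n"

# Clients re-fetch this small file every second or so to learn about new chunks;
# the rising media_sequence tells them which old segments have been dropped.
print(hls_playlist([("segment00042.ts", 1.0), ("segment00043.ts", 1.0)],
                   media_sequence=42))
```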
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake gstreamer1.0-tools gstreamer1.0-plugins-base \<br />
 gstreamer1.0-plugins-{good,bad,ugly} gstreamer1.0-alsa gstreamer1.0-x \<br />
 v4l-utils alsa-utils libpango1.0-0 libpango1.0-dev x264 liblivemedia-dev \<br />
 liblog4cpp5-dev libasound2-dev vlc libssl-dev iotop autoconf automake \<br />
 libtool v4l2loopback-dkms libvpx-dev libx264-dev libjpeg-dev libx265-dev \<br />
 linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the RTSP stream. Total system RAM used is roughly 64MB and the load average is ~0.4-0.5 when idle, and ~0.51-0.60 with one RTSP client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
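If you go the multicast route, any IPv4 group address in 224.0.0.0/4 qualifies (239.0.0.0/8 is the administratively scoped range usually used on local networks). A quick sketch for checking an address before putting it in <code>udpsink host=</code>; the example addresses are arbitrary:<br />

```python
import ipaddress

def is_multicast(addr: str) -> bool:
    """True if addr is an IPv4/IPv6 multicast group address."""
    return ipaddress.ip_address(addr).is_multicast

print(is_multicast("239.255.12.42"))  # True: usable as a udpsink group address
print(is_multicast("192.168.1.10"))   # False: a plain unicast address
```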
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
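The framing that <code>multipartmux</code> plus the hand-written HTTP response produce can be sketched as follows; the boundary string is arbitrary and the JPEG payload is a dummy stand-in:<br />

```python
BOUNDARY = "duct_tape_boundary"

def mjpeg_part(jpeg_bytes: bytes) -> bytes:
    """Frame one JPEG image as one part of a multipart/x-mixed-replace stream."""
    header = (
        f"--{BOUNDARY}\r\n"
        "Content-Type: image/jpeg\r\n"
        f"Content-Length: {len(jpeg_bytes)}\r\n"
        "\r\n"
    ).encode("ascii")
    return header + jpeg_bytes + b"\r\n"

# One-off HTTP response header, then one part per camera frame, forever; the
# browser replaces the displayed image each time a new part arrives.
response_header = (
    "HTTP/1.1 200 OK\r\n"
    f"Content-Type: multipart/x-mixed-replace;boundary={BOUNDARY}\r\n"
    "\r\n"
).encode("ascii")

frame = b"\xff\xd8 fake jpeg data \xff\xd9"   # stand-in for real JPEG bytes
stream = response_header + mjpeg_part(frame)
```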
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software such as Jitsi Meet. Note that this has fairly minimal (&lt;1s) lag when tested on a wired 1Gb Ethernet connection, and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so keep this in mind. The following instructions assume Debian Linux (Bullseye) on your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, like they did [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, you will need to set up the PineCube with gstreamer much like the gstreamer examples above, but at 1280x720 resolution. You will stream to the desktop machine over UDP port 8000, with the desktop's IP address represented by $desktop below.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
The most common error found when launching the gstreamer pipeline above is the following error message, which seems to happen when [https://github.com/umlaeute/v4l2loopback/issues/174 max_buffers isn't set] on the v4l2loopback module (see above), or if a v4l client (vlc, chromium) is already connected to /dev/video10 when starting the pipeline. There does seem to be a small level of instability in this stack that could be improved.<br />
<br />
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:<br />
streaming stopped, reason not-negotiated (-4)<br />
<br />
Now that /dev/video10 is hooked into the gstreamer pipeline, you can connect to it using VLC. VLC is a good local test that things are working; you can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect VLC before trying to use the virtual web camera with Chromium. Launch Chromium and go to a web conference like [https://meet.jit.si Jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in VLC. Note that Firefox isn't really working at the moment and the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output shows only sensor-noise lines over a black or white image, the camera may be in a broken state. In that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a build environment on your own machine, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file].<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{note|1=PAGE UNDER CONSTRUCTION, INFO SUBJECT TO CHANGE}}<br />
<br />
<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Ouptut<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchase at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PinePhone 3D file]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tdb (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The only Armbian release with support for Ethernet and the camera module is the Ubuntu Groovy release. The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan the network for the device IP (hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8 GB microSD cards and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328 MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], these builds enable the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are as usual in Armbian (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled (as root) with <code>systemctl enable motion</code>, then reboot.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube is marked by a small white dot on the PCB, directly next to the micro-USB power connector. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND so the microcontroller is held in reset.<br />
<br />
After this you can either use the Arduino IDE and its serial monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires into the GPIO header at the following locations, in a "null modem" configuration with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the PineCube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2 Mb/s using 2.462 GHz wireless. Running sixty-second iperf3 tests, the observed throughput varies between 28 and 50 Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch, which is also connected to a wireless bridge. The wireless network uses WPA2, and the PineCube is connected to this wireless bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
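As a sanity check, iperf3's bitrate column can be reproduced from its transfer column: transfers are reported in binary MBytes (1024*1024 bytes) while bitrates are decimal Mbits/sec. A short Python sketch of the conversion:<br />

```python
def iperf_bitrate_mbps(transfer_mbytes, seconds):
    """Reproduce iperf3's bitrate column from its transfer column:
    transfer is reported in binary MBytes (1024*1024 bytes), bitrate
    in decimal Mbits/sec."""
    bits = transfer_mbytes * 1024 * 1024 * 8
    return bits / seconds / 1e6

# 293 MBytes over 60 seconds is the ~41.0 Mbits/sec shown above
print(round(iperf_bitrate_mbps(293, 60), 1))
```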
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100 Mb/s using a 1000 Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92 and 102 Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
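To make the moving parts concrete, here is a purely illustrative Python sketch (not part of the pipeline below) of the rolling playlist that <code>hlssink</code> rewrites after each segment with <code>playlist-length=2</code>: only the newest segments are listed, and the media sequence number tells clients where the window starts:<br />

```python
def rolling_playlist(segment_names, playlist_length=2, target_duration=1):
    """Build playlist.m3u8 contents for the newest segments, mimicking
    the rolling window an HLS segmenter maintains for a live stream."""
    window = segment_names[-playlist_length:]
    # MEDIA-SEQUENCE tells clients which segment index the window starts at
    first = len(segment_names) - len(window)
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        f"#EXT-X-MEDIA-SEQUENCE:{first}",
    ]
    for name in window:
        lines.append(f"#EXTINF:{target_duration},")
        lines.append(name)
    return "\n".join(lines) + "\n"

print(rolling_playlist(["segment00000.ts", "segment00001.ts", "segment00002.ts"]))
```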
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
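For reference, a minimal <code>index.html</code> along the lines of the hls.js getting-started example might look like the following (a sketch; it assumes you downloaded <code>hls.js</code> into the same /dev/shm/hls directory):<br />

```html
<!-- Minimal player page for /dev/shm/hls/index.html; assumes hls.js
     was saved next to it (adjust the src attribute otherwise) -->
<!DOCTYPE html>
<html>
<body>
  <video id="video" controls autoplay muted></video>
  <script src="hls.js"></script>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Safari/iPhone can play HLS natively
      video.src = 'playlist.m3u8';
    }
  </script>
</body>
</html>
```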
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
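A minimal sketch of such an nginx server block (assuming a standard Debian/Ubuntu layout, e.g. replacing the default server block in <code>/etc/nginx/sites-enabled/default</code>) might be:<br />

```nginx
server {
    listen 80 default_server;
    root /dev/shm/hls;   # serves index.html, playlist.m3u8 and the .ts segments
    index index.html;
    location / {
        add_header Cache-Control no-cache;  # the playlist is rewritten constantly
    }
}
```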
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
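For illustration, a minimal nginx-rtmp ingest block (a sketch; it assumes the RTMP module is installed, e.g. <code>libnginx-mod-rtmp</code> on Debian/Ubuntu, and the application name and paths are examples) looks roughly like:<br />

```nginx
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            hls on;                 # repackage incoming RTMP as HLS
            hls_path /dev/shm/hls;
            hls_fragment 1s;
        }
    }
}
```

A gstreamer pipeline can then publish to it by ending in <code>... ! flvmux ! rtmpsink location=rtmp://server/live/stream</code> (RTMP carries FLV, hence the flvmux).<br />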
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-plugins-bad gstreamer1.0-tools \<br />
gstreamer1.0-plugins-good v4l-utils gstreamer1.0-alsa alsa-utils libpango1.0-0 \<br />
libpango1.0-dev gstreamer1.0-plugins-base gstreamer1.0-x x264 \<br />
gstreamer1.0-plugins-{good,bad,ugly} liblivemedia-dev liblog4cpp5-dev \<br />
libasound2-dev vlc libssl-dev iotop libasound2-dev liblog4cpp5-dev \<br />
liblivemedia-dev autoconf automake libtool v4l2loopback-dkms liblog4cpp5-dev \<br />
libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the stream. Total system RAM used is roughly 64MB, and the load average is ~0.4-0.5 when idle and ~0.51-0.60 with one client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
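As a concrete aid, IPv4 multicast groups live in 224.0.0.0/4, and Python's standard <code>ipaddress</code> module can sanity-check a candidate group address (239.0.0.0/8, the administratively scoped range, is a reasonable choice on a private LAN):<br />

```python
import ipaddress

def is_usable_multicast_group(addr):
    """True for IPv4 multicast addresses, i.e. a group address that
    both the udpsink sender and udpsrc receivers could join."""
    ip = ipaddress.ip_address(addr)
    return ip.version == 4 and ip.is_multicast

print(is_usable_multicast_group("239.255.12.42"))  # administratively scoped group
print(is_usable_multicast_group("192.168.1.10"))   # ordinary unicast address
```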
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
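The byte layout such a server emits can be sketched in a few lines of Python (illustrative only; the boundary name is arbitrary but must match between the header and each part):<br />

```python
def mjpeg_response_head(boundary):
    """HTTP response head announcing a multipart MJPEG stream."""
    return (b"HTTP/1.1 200 OK\r\n"
            b"Content-type: multipart/x-mixed-replace;boundary=" +
            boundary.encode() + b"\r\n\r\n")

def mjpeg_part(boundary, jpeg_bytes):
    """One frame: boundary marker, part header, then the raw JPEG data."""
    return (b"--" + boundary.encode() + b"\r\n"
            b"Content-Type: image/jpeg\r\n\r\n" +
            jpeg_bytes + b"\r\n")

frame = b"\xff\xd8...\xff\xd9"  # placeholder for real JPEG data
stream = mjpeg_response_head("duct_tape_boundary") + mjpeg_part("duct_tape_boundary", frame)
```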
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
=== virtual web camera: gstreamer, mjpeg, udp rtp unicast ===<br />
<br />
It's possible to set up the PineCube as a virtual camera video device (Video4Linux) so that you can use it with video conferencing software, such as Jitsi Meet. Note that this has fairly minimal (under 1s) lag when tested on a wired 1Gb Ethernet connection, and the frame rate is passable. MJPEG is very wasteful in terms of network resources, so this is something to keep in mind. The following instructions assume Debian Linux (Bullseye) as your desktop machine, but could work with other Linux OSes too. It's possible that someday a similar system could work with Mac OS X, provided that someone writes a gstreamer plugin that exposes a Mac OS Core Media DAL device as a virtual webcam, as was done [https://github.com/johnboiles/obs-mac-virtualcam here] for OBS.<br />
<br />
First, set up the PineCube with gstreamer much as in the examples above, but at 1280x720 resolution. You will stream to the desktop machine, whose IP address is represented by $desktop below, at UDP port 8000.<br />
<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'<br />
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! rtpjpegpay name=pay0 ! udpsink host=$desktop port=8000<br />
<br />
On your desktop machine, you will need to install the gstreamer suite and the special v4l2loopback kernel module to bring the mjpeg stream to the Video 4 Linux device /dev/video10.<br />
<br />
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l2loopback-dkms<br />
sudo modprobe v4l2loopback video_nr=10 max_buffers=32 exclusive_caps=1 # Creates /dev/video10 as a virtual v4l2 device, allocates increased buffers and exposes exclusive capabilities for chromium to find the video device<br />
gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26,framerate=30/1 ! rtpjpegdepay ! jpegdec ! video/x-raw, format=I420, width=1280, height=720 ! autovideoconvert ! v4l2sink device=/dev/video10<br />
<br />
Now that you have /dev/video10 hooked into the gstreamer pipeline you can then connect to it using VLC. VLC is a good local test that things are working. You can view the stream like this. Note that you could do the same thing with mpv/ffmpeg, but there are [https://www.raspberrypi.org/forums/viewtopic.php?t=270023 problems] currently.<br />
<br />
vlc v4l2:///dev/video10<br />
<br />
Be sure to disconnect VLC before trying to use the virtual web camera with Chromium. Launch Chromium and go to a web conference such as [https://meet.jit.si Jitsi]. When it prompts you for the camera, pick the "Dummy Video Device..." and it should look much like what you see in VLC. Note that Firefox isn't really working at the moment, and the symptoms appear very similar to the mpv/ffmpeg problem mentioned above, i.e. when they connect to the camera they show only the first frame and then drop. It's unclear whether the bug is in gstreamer, v4l, or ffmpeg (or somewhere in these instructions).<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file].<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=9485PineCube2021-03-17T23:26:59Z<p>Newton688: </p>
<hr />
<div>{{note|1=PAGE UNDER CONSTRUCTION, INFO SUBJECT TO CHANGE}}<br />
<br />
<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Ouptut<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchase at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PinePhone 3D file]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tbd (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The only Armbian release with support for Ethernet and the camera module is the Ubuntu Groovy release. The Ubuntu Groovy release is an experimental, automatically generated release and appears to support more hardware than the other Armbian releases.<br />
<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan the network for the device IP (hostname: pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD cards and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZ file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], this enables the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled with systemctl (as root): <code>systemctl enable motion</code>. Then reboot.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube is marked with a small white dot on the PCB; it is directly next to the micro-USB power connector. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND to hold the microcontroller in reset.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ store]. Using a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block] and breadboard wires, connect it to the GPIO header at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the PineCube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
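If packet loss and jitter are of interest rather than TCP throughput, iperf3 can also generate UDP traffic; a sketch, assuming the same <code>pinecube</code> hostname resolves and the 40Mb/s offered load is an arbitrary choice:<br />

```shell
# UDP test at a fixed 40 Mbit/s offered load for 60 seconds;
# the final report shows datagram loss and jitter
iperf3 -c pinecube -u -b 40M -t 60

# Same in the reverse direction (PineCube transmits)
iperf3 -c pinecube -u -b 40M -t 60 -R
```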
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
These performance results were obtained over the wireless network. The link speed was 72.2Mb/s on a 2.462GHz channel. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
These performance results were obtained over the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the SD card in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
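A minimal index.html along those lines might look like the following sketch, based on the hls.js getting-started example; it assumes you saved a local copy of hls.js next to it in /dev/shm/hls (the filename <code>hls.js</code> is this example's choice):<br />

```shell
# Write a minimal HLS player page into the tmpfs web root
cat > /dev/shm/hls/index.html <<'EOF'
<!DOCTYPE html>
<html>
<body>
  <video id="video" controls autoplay muted></video>
  <script src="hls.js"></script>
  <script>
    var video = document.getElementById('video');
    if (Hls.isSupported()) {
      // Browsers without native HLS: use hls.js
      var hls = new Hls();
      hls.loadSource('playlist.m3u8');
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Safari/iPhone: native HLS playback
      video.src = 'playlist.m3u8';
    }
  </script>
</body>
</html>
EOF
```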
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
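For example, a minimal nginx site definition along these lines (the paths and symlink layout are Debian/Ubuntu assumptions; adjust for your distribution):<br />

```shell
# Point nginx's default site at the tmpfs HLS directory
cat > /etc/nginx/sites-available/hls <<'EOF'
server {
    listen 80 default_server;
    root /dev/shm/hls;
    # HLS playlists change constantly; tell clients not to cache them
    location ~ \.m3u8$ {
        add_header Cache-Control no-cache;
    }
}
EOF
ln -sf /etc/nginx/sites-available/hls /etc/nginx/sites-enabled/default
systemctl reload nginx
```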
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
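As a sketch of the <code>rtmpsink</code> variant mentioned above, the capture half of the HLS pipeline can be pointed at an nginx-rtmp server instead of <code>hlssink</code>; the RTMP URL is a placeholder and this pipeline is untested on the PineCube:<br />

```shell
# Software x264 encode, muxed into FLV and pushed to an nginx-rtmp server
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' &&
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! \
  videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency ! h264parse ! \
  flvmux streamable=true ! rtmpsink location='rtmp://example.com/live/stream'
```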
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake autoconf automake libtool gstreamer1.0-tools \<br />
gstreamer1.0-plugins-base gstreamer1.0-plugins-{good,bad,ugly} gstreamer1.0-alsa \<br />
gstreamer1.0-x v4l-utils alsa-utils libpango1.0-0 libpango1.0-dev x264 \<br />
liblivemedia-dev liblog4cpp5-dev libasound2-dev vlc libssl-dev iotop \<br />
v4l2loopback-dkms libvpx-dev libx264-dev libjpeg-dev libx265-dev \<br />
linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
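To check for and stop a running motion instance as described above (assuming motion was started as a systemd service, as in the Armbian image):<br />

```shell
pgrep -a motion              # list any running motion processes
systemctl stop motion        # stop the service for this boot
systemctl disable motion     # optionally keep it from starting at boot
```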
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the HLS stream. Total system RAM used is roughly 64MB; the load average is about 0.4-0.5 when idle, and 0.51-0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
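An untested sketch combining those bits (software x264 encoding, so expect it to be slow at higher resolutions):<br />

```shell
# Configure the sensor, then serve a software-encoded h264 stream over RTSP
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/320x240@1/15]' &&
gst-rtsp-launch 'v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
```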
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
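For instance, with the administratively-scoped multicast address 239.255.12.42 (an arbitrary choice):<br />

```shell
# Sender: stream to a multicast group instead of a single host
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! \
  udpsink host=239.255.12.42 port=8000 auto-multicast=true

# Each receiver joins the same group
gst-launch-1.0 udpsrc address=239.255.12.42 port=8000 auto-multicast=true ! \
  'application/x-rtp, encoding-name=JPEG, payload=26' ! rtpjpegdepay ! jpegdec ! autovideosink
```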
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file].<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=9484PineCube2021-03-17T23:24:35Z<p>Newton688: Add section about using the pinestore's debugging cable for pp and pbp with the pinecube</p>
<hr />
<div>{{note|1=PAGE UNDER CONSTRUCTION, INFO SUBJECT TO CHANGE}}<br />
<br />
<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Ouptut<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchase at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PinePhone 3D file]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tdb (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
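A build from Elimo's tree can be sketched roughly as follows. This is an untested outline: the branch name is taken from the repository URL above, and the defconfig name is an assumption; the readme in the board support directory is the authoritative source.<br />

```shell
# Sketch: building the Elimo Buildroot tree for the PineCube.
# Branch and defconfig names are assumptions -- see the board support
# directory readme in the repo for the authoritative steps.
git clone https://github.com/elimo-engineering/buildroot.git
cd buildroot
git checkout pine64/pinecube        # branch name inferred from the repo URL
make pine64_pinecube_defconfig      # assumed defconfig name
make                                # resulting images land in output/images/
```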
<br />
=== Armbian ===<br />
<br />
The only Armbian release with support for Ethernet and the camera module is the Ubuntu Groovy release. The Ubuntu Groovy release is an experimental, automatically generated release, and it appears to support additional hardware compared to the other Armbian releases.<br />
<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB microSD cards and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MB<br />
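The image can be verified and written to a microSD card along these lines; <code>/dev/sdX</code> is a placeholder for your card's device node, so double-check it with <code>lsblk</code> first, since <code>dd</code> overwrites the target.<br />

```shell
# Check the download against the MD5 listed above, then flash it.
echo "61e5a6d3ab0f74ce8367c97b7f8cbb7b  Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz" | md5sum -c -
# /dev/sdX is a placeholder -- confirm the device name with lsblk before running.
xzcat Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz | sudo dd of=/dev/sdX bs=4M status=progress conv=fsync
```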
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported] it enables the usage of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled as root with <code>systemctl enable motion</code>, then reboot.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube has a small white dot on the PCB - this should be directly next to the microusb power connection. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND to isolate the SoC.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Serial connection using pinephone/pinebook pro serial debugging cable ====<br />
<br />
You can use the [https://wiki.pine64.org/wiki/PinePhone#Serial_console serial console] USB cable for the PinePhone and Pinebook Pro, available at the [https://pine64.com/product/pinebook-pinephone-pinetab-serial-console/ Pine Store]. With a [https://www.amazon.com/3-5mm-Stereo-Female-terminal-connector/dp/B077XPSKQD female terminal block], run breadboard wires into the GPIO block at the following locations in a "null modem" configuration, with transmit and receive crossed between your computer and the PineCube:<br />
<br />
S - Ground (to pin 9)<br />
R - Transmit (to pin 8)<br />
T - Receive (to pin 10)<br />
<br />
From Linux you can access the console of the PineCube using the screen command:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
The same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s using 2.462Ghz wireless. Running sixty second iperf3 tests: the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s using a 1000Mb/s prosumer switch. Running sixty second iperf3 tests: the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch which is also connected to the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so, we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the sdcard in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternately, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will run on port 80 rather than the python3 server which defaults to port 8000.<br />
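One way to do that nginx setup is sketched below. The paths assume a stock Debian/Ubuntu nginx layout (<code>sites-available</code>/<code>sites-enabled</code>); adjust if your distribution lays things out differently.<br />

```shell
# Sketch: serve /dev/shm/hls with nginx on port 80 (assumes Debian/Ubuntu layout).
apt install -y nginx
cat > /etc/nginx/sites-available/hls <<'EOF'
server {
    listen 80 default_server;
    root /dev/shm/hls;
    # Clients re-fetch the playlist constantly, so it must not be cached.
    location ~ \.m3u8$ { add_header Cache-Control no-cache; }
}
EOF
rm -f /etc/nginx/sites-enabled/default
ln -s /etc/nginx/sites-available/hls /etc/nginx/sites-enabled/hls
nginx -t && systemctl reload nginx
```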
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
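For reference, sending to an nginx-rtmp server from gstreamer might look like the following untested sketch, which reuses the encoder settings from the HLS pipeline above; <code>rtmp://server/live/stream</code> is a placeholder for your own nginx-rtmp endpoint.<br />

```shell
# Untested sketch: push the camera to an nginx-rtmp endpoint instead of hlssink.
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]'
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 \
  ! videoconvert ! video/x-raw,format=I420 \
  ! x264enc speed-preset=ultrafast tune=zerolatency \
  ! flvmux streamable=true ! rtmpsink location='rtmp://server/live/stream'
```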
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
 apt install -y cmake gstreamer1.0-tools gstreamer1.0-plugins-base \<br />
 gstreamer1.0-plugins-{good,bad,ugly} gstreamer1.0-alsa gstreamer1.0-x \<br />
 v4l-utils alsa-utils libpango1.0-0 libpango1.0-dev x264 \<br />
 liblivemedia-dev liblog4cpp5-dev libasound2-dev vlc libssl-dev iotop \<br />
 autoconf automake libtool v4l2loopback-dkms \<br />
 libvpx-dev libx264-dev libjpeg-dev libx265-dev linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
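A short check for that situation, assuming motion was installed as a systemd service (as in the stock image):<br />

```shell
# If media-ctl reports "Device or resource busy", see whether motion
# is holding the camera, and stop it before reconfiguring the sensor.
pgrep -a motion              # list any running motion processes
systemctl stop motion        # stop the service if it is installed
pkill -f /usr/bin/motion     # fall back to killing stray processes
```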
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around ~45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the HLS stream. Total system RAM used is roughly 64MB and the load average is ~0.4-~0.5 when idle, and ~0.51-~0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so instead used this very small 3rd party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
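As a starting point, the following untested sketch combines the encoder settings from the HLS example with <code>gst-rtsp-launch</code> from the JPEG RTSP example; please verify it on hardware and update the wiki with your findings.<br />

```shell
# Untested sketch: software-encoded h264 over RTSP via gst-rtsp-launch.
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]'
gst-rtsp-launch 'v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0'
```

If it works, it should be playable like the JPEG stream, e.g. <code>vlc rtsp://pinecube.local:8554/video</code>.<br />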
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
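A multicast variant might look like this; <code>239.255.12.34</code> is an arbitrary example address from the administratively scoped multicast range, not anything the original examples prescribe.<br />

```shell
# Sender: transmit to a multicast group instead of a single client.
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 \
  ! udpsink host=239.255.12.34 port=8000 auto-multicast=true
# Each receiver joins the same group:
gst-launch-1.0 udpsrc address=239.255.12.34 port=8000 auto-multicast=true \
  ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
```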
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers which precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), somehow nobody has yet gotten a webserver element merged in to gstreamer (which is necessary to produce the HTTP response, which is required for browsers other than firefox to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
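For example, the HLS capture pipeline with <code>videotestsrc</code> substituted needs no camera and no <code>media-ctl</code> call, which isolates encoder and streaming problems from sensor problems:<br />

```shell
# Same HLS pipeline as above, but fed from a synthetic test pattern.
cd /dev/shm/hls/ && gst-launch-1.0 videotestsrc \
  ! video/x-raw,width=320,height=240,framerate=15/1 ! videoconvert ! video/x-raw,format=I420 \
  ! clockoverlay ! x264enc speed-preset=ultrafast tune=zerolatency \
  ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3
```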
<br />
If the camera produces only sensor noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file]<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
<hr />
<div>{{note|1=PAGE UNDER CONSTRUCTION, INFO SUBJECT TO CHANGE}}<br />
<br />
<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:'''<br />
** 10/100Mbps Ethernet with passive PoE<br />
** USB 2.0 A host<br />
** 26 pins GPIO port<br />
*** 2x 3.3V Ouptut<br />
*** 2x 5V Output<br />
*** 1x I2C<br />
*** 2x UART<br />
*** 2x PWM<br />
*** 1x SPI<br />
*** 1x eMMC/SDIO/SD (8-bit)<br />
*** 6x Interrupts<br />
*** '''Note: Interfaces are multiplexed, so they can't be all used at same time'''<br />
** Internal microphone<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
* '''Power DC in:'''<br />
** 5V 1A from MicroUSB Port or GPIO port<br />
** 4V-18V from Ethernet passive PoE<br />
* '''Battery:''' optional 950-1600mAh model: 903048 Lithium Polymer Ion Battery Pack, can be purchase at [https://www.amazon.com/AKZYTUE-1200mAh-Battery-Rechargeable-Connector/dp/B07TWHHCNK/ Amazon.com]<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
** [https://files.pine64.org/doc/cert/PineCube-FCC-SDOC%20certification%20S20072502302001.pdf PineCube FCC Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-CE-EMC%20certification%20S20072502301001.pdf PineCube CE RED Certificate]<br />
** [https://files.pine64.org/doc/cert/PineCube-ROHS%20Test%20Report.pdf PineCube ROHS Test Report]<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/CH-5A-DV-V2.0%20Specification.pdf PineCube Camera Module Specification]<br />
** [https://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [https://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
* GPIO Header Pinout: ([[:File:PineCube_GPIO.pdf|PDF]], [https://pine64.gami.ee/pinecube/gpio-pinout.html HTML]) (Pin1 is marked on the board by a white dot on the PCB)<br />
[[File:PineCube_GPIO_Pinout.png|700px]]<br />
<br />
<br />
* Case information:<br />
** [https://files.pine64.org/doc/PineCube/PineCube%20Case%203D.zip PinePhone 3D file]<br />
<br />
[[File:PineCube_Case-1.jpg|400px]] [[File:PineCube_Case-2.jpg|400px]]<br />
<br />
== Operating Systems ==<br />
<br />
=== Mainlining Efforts ===<br />
<br />
Please note:<br />
<br />
* this list is most likely not complete<br />
* no review of functionality is done here, it only serves as a collection of efforts<br />
<br />
{| class="wikitable"<br />
!colspan="3"|Linux kernel<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| Devicetree Entry Pinecube<br />
| https://lkml.org/lkml/2020/9/22/1241<br />
| 5.10<br />
|-<br />
| Correction for AXP209 driver<br />
| https://lkml.org/lkml/2020/9/22/1243<br />
| 5.9<br />
|-<br />
| Additional Fixes for AXP209 driver<br />
| https://lore.kernel.org/lkml/20201031182137.1879521-8-contact@paulk.fr/<br />
| tdb (5.11?)<br />
|-<br />
| Device Tree Fixes<br />
| https://lore.kernel.org/lkml/20201003234842.1121077-1-icenowy@aosc.io/<br />
| 5.10<br />
|-<br />
!colspan="3"|U-boot<br />
|-<br />
| Type<br />
| Link<br />
| Available in version<br />
|-<br />
| PineCube Board Support<br />
| https://patchwork.ozlabs.org/project/uboot/list/?series=210044<br />
| expected in v2021.01<br />
|-<br />
!colspan="3"|Buildroot<br />
|-<br />
| No known mainlining efforts yet<br />
|<br />
|<br />
|}<br />
<br />
<br />
=== NixOS ===<br />
<br />
* [https://github.com/danielfullmer/pinecube-nixos danielfullmer's Github]<br />
<br />
<br />
<br />
=== Buildroot ===<br />
[https://elimo.io Elimo Engineering] integrated support for the PineCube into Buildroot.<br />
<br />
This has not been merged into upstream Buildroot yet, but you can find the repo on [https://github.com/elimo-engineering/buildroot Elimo's GitHub account] and build instructions in the [https://github.com/elimo-engineering/buildroot/tree/pine64/pinecube/board/pine64/pinecube board support directory] readme.<br />
The most important thing that this provides is support for the S3's DDR3 in u-boot. Unfortunately mainline u-boot does not have that yet, but the u-boot patches from [https://github.com/danielfullmer/pinecube-nixos Daniel Fullmer's NixOS repo] were easy enough to use on buildroot.<br />
This should get you a functional system that boots to a console on UART0. It's pretty fast too, getting there in 1.5 seconds from u-boot to login prompt.<br />
<br />
=== Armbian ===<br />
<br />
The only Armbian release with support for Ethernet and the camera module is the Ubuntu Groovy release. The Ubuntu Groovy release is an experimental, automatically generated release and it appears to support additional hardware from the other Armbian releases.<br />
<br />
<br />
==== Armbian Build Image with motion [microSD Boot] [20201222] ====<br />
* Armbian Ubuntu Focal build for the Pinecube with the motion (detection) package preinstalled.<br />
* There are 2 ways to interact with the OS:<br />
** Scan for the the device IP (with hostname pinecube)<br />
** Use the PINE64 USB SERIAL CONSOLE/PROGRAMMER to login to the serial console, then check for assigned IP<br />
* DD image (for 8GB micoSD card and above)<br />
** [https://files.pine64.org//os/PineCube/armbian/Armbian_21.02.0-trunk_Pinecube_focal_dev_5.10.0.img.xz Direct download from pine64.org]<br />
*** MD5 (XZip file): 61e5a6d3ab0f74ce8367c97b7f8cbb7b<br />
*** File Size: 328MGB<br />
<br />
[https://gist.github.com/Icenowy/ff68f6e4ba8231380d3a295226e63fb3 GitHub gist] for the userpatch which pre-installs and configures the motion (detection) package. <br />
<br />
Armbian Builds for PineCube are [https://www.armbian.com/pinecube/ available for download], once again thanks to [https://github.com/armbian/build/pull/2364/files the work] of Icenowy Zheng.<br />
Although [https://www.armbian.com/download/?device_support=No+official+support+(CSC) not officially supported], this enables the use of Debian and Ubuntu.<br />
<br />
A serial console can be established at 115200 8N1 (no hardware flow control). Login credentials are the Armbian defaults (login: root, password: 1234).<br />
<br />
The motion daemon can be enabled with systemctl (as root): <code>systemctl enable motion</code>. Then reboot.<br />
<br />
==== Serial connection using screen and the woodpecker USB serial device ====<br />
<br />
First connect the woodpecker USB serial device to the PineCube. Pin 1 on the PineCube is marked with a small white dot on the PCB; it is directly next to the micro-USB power connector. Attach the GND pin on the woodpecker to pin 6 (GND) on the PineCube, TXD from the woodpecker to pin 10 (UART_RXD) on the PineCube, and RXD from the woodpecker to pin 8 (UART_TXD) on the PineCube.<br />
<br />
On the host system which has the woodpecker USB serial device attached, it is possible to run screen and to communicate directly with the PineCube:<br />
<br />
<code>screen /dev/ttyUSB0 115200</code><br />
<br />
==== Serial connection using screen and Arduino Uno ====<br />
<br />
You can use the Arduino Uno or other Arduino boards as a USB serial device.<br />
<br />
First you must either remove the microcontroller from its socket, or, if your Arduino board does not allow this, use a wire to jump RESET (RST) to GND to hold the microcontroller in reset.<br />
<br />
After this you can either use the Arduino IDE and its Serial Monitor after selecting your <code>/dev/ttyACMx</code> Arduino device, or screen:<br />
<br />
<code>screen /dev/ttyACM0 115200</code><br />
<br />
[[File:ArduinoSerialPinecube.jpg|400px]]<br />
<br />
==== Basic bandwidth tests with iperf3 ====<br />
<br />
Install armbian-config:<br />
<code>apt install armbian-config</code><br />
<br />
Enable iperf3 through the menu in armbian-config:<br />
<code>armbian-config</code><br />
<br />
On a test computer on the same network segment run iperf3 as a client:<br />
<code>iperf3 -c pinecube -t 60</code><br />
<br />
On the same test computer, run iperf3 in the reverse direction:<br />
<code>iperf3 -c pinecube -t 60 -R</code><br />
<br />
=== Performance results ===<br />
<br />
==== Wireless network performance ====<br />
The performance results reflect using the wireless network. The link speed was 72.2Mb/s on the 2.462GHz band. Running sixty-second iperf3 tests, the observed throughput varies between 28-50Mb/s to a host on the same network segment. The testing host is connected to an Ethernet switch which is also connected to a wireless bridge. The wireless network uses WPA2 and the PineCube is connected to this wireless network bridge.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 293 MBytes 41.0 Mbits/sec 1 sender<br />
[ 5] 0.00-60.01 sec 291 MBytes 40.7 Mbits/sec receiver<br />
<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.85 sec 263 MBytes 36.2 Mbits/sec 3 sender<br />
[ 5] 0.00-60.00 sec 259 MBytes 36.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 230 MBytes 32.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.09 sec 229 MBytes 32.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.14 sec 246 MBytes 34.3 Mbits/sec 7 sender<br />
[ 5] 0.00-60.00 sec 245 MBytes 34.2 Mbits/sec receiver<br />
<br />
==== Wired network performance ====<br />
<br />
The Ethernet network does not work in the current Ubuntu Focal Armbian image or the Ubuntu Groovy Armbian image.<br />
<br />
The performance results reflect using the Ethernet network. The link speed was 100Mb/s on a 1000Mb/s prosumer switch. Running sixty-second iperf3 tests, the observed throughput varies between 92-102Mb/s to a host on the same network segment. The testing host is connected to the same Ethernet switch as the PineCube.<br />
<br />
Client rate for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 675 MBytes 94.4 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 673 MBytes 94.0 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 673 MBytes 94.1 Mbits/sec receiver<br />
<br />
Using WireGuard to protect the traffic between the PineCube and the test system, the performance characteristics change only slightly.<br />
<br />
Client rate for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.00 sec 510 MBytes 71.2 Mbits/sec 0 sender<br />
[ 5] 0.00-60.01 sec 509 MBytes 71.1 Mbits/sec receiver<br />
<br />
Client rate with -R for sixty seconds with WireGuard:<br />
<br />
[ ID] Interval Transfer Bitrate Retr<br />
[ 5] 0.00-60.01 sec 642 MBytes 89.8 Mbits/sec 0 sender<br />
[ 5] 0.00-60.00 sec 641 MBytes 89.7 Mbits/sec receiver<br />
<br />
== Streaming the camera to the network ==<br />
<br />
In this section we document a variety of ways to stream video to the network from the PineCube. Unless specified otherwise, all of these examples have been tested on Ubuntu groovy (20.10). See [https://github.com/ioerror/pinecube this small project for the pinecube] for easy to use programs tuned for the PineCube.<br />
<br />
In the examples which use h264, we are currently encoding using the x264 library which is not very fast on this CPU. The SoC in the PineCube does have a hardware h264 encoder, which the authors of these examples have so far not tried to use. It appears that https://github.com/gtalusan/gst-plugin-cedar might provide easy access to it, however. Please update this wiki if you find out how to use the hardware encoder!<br />
<br />
=== gstreamer: h264 HLS ===<br />
<br />
HLS (HTTP Live Streaming) has the advantage that it is easy to play in any modern web browser, including Android and iPhone devices, and that it is easy to put an HTTP caching proxy in front of it to scale to many viewers. It has the disadvantages of adding (at minimum) several seconds of latency, and of requiring an h264 encoder (which we have in hardware, but haven't figured out how to use yet, so we're stuck with the slow software one).<br />
<br />
HLS segments a video stream into small chunks which are stored as .ts (MPEG Transport Stream) files, and (re)writes a playlist.m3u8 file which clients constantly refresh to discover which .ts files they should download. We use a tmpfs file system to avoid needing to write these files to the SD card in the PineCube. Besides the program which writes the .ts and .m3u8 files (gst-launch-1.0, in our case), we'll also need a very basic web page in tmpfs and a webserver to serve the files.<br />
<br />
Create an hls directory to be shared in the existing tmpfs file system that is mounted at /dev/shm:<br />
<br />
<code>mkdir /dev/shm/hls/</code><br />
<br />
Create an index.html and optionally a favicon.ico or even a set of icons, and then put the files into the /dev/shm/hls directory. An example index.html that works is available in the Getting Started section of the [https://github.com/video-dev/hls.js/#getting-started README] for [https://github.com/video-dev/hls.js/ hls.js]. We recommend downloading the hls.js file and editing the example index.html to serve your local copy of it instead of fetching it from a CDN. This file provides HLS playback capabilities in browsers which don't natively support it (which is most browsers aside from the iPhone).<br />
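As a sketch, such a page can be generated from the shell. This is adapted from the hls.js "Getting Started" example; the <code>HLS_DIR</code> variable name is our own, and it assumes a local copy of hls.js has been downloaded into the same directory:<br />

```shell
#!/bin/sh
# Generate a minimal HLS player page in the tmpfs directory.
# Assumes ./hls.js was downloaded alongside it (see the text above).
HLS_DIR=${HLS_DIR:-/dev/shm/hls}
mkdir -p "$HLS_DIR"
cat > "$HLS_DIR/index.html" <<'EOF'
<!DOCTYPE html>
<html>
  <body>
    <script src="hls.js"></script>
    <video id="video" controls muted autoplay></video>
    <script>
      var video = document.getElementById('video');
      if (Hls.isSupported()) {            // most browsers, via hls.js
        var hls = new Hls();
        hls.loadSource('playlist.m3u8');  // written by gstreamer's hlssink
        hls.attachMedia(video);
      } else {                            // e.g. iPhone: native HLS support
        video.src = 'playlist.m3u8';
      }
    </script>
  </body>
</html>
EOF
```

After running this, the webserver started below will serve the page alongside the playlist and segments.<br />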
<br />
In one terminal, run the camera capture pipeline:<br />
<code><br />
cd /dev/shm/hls/ && <br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/240x320@1/15]' && <br />
gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
Alternatively it is possible to capture at a higher resolution:<br />
<code><br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/1920x1080@1/15]'<br />
cd /dev/shm/hls/ && gst-launch-1.0 v4l2src ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=15/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! clockoverlay ! timeoverlay valignment=bottom ! x264enc speed-preset=ultrafast tune=zerolatency ! mpegtsmux ! hlssink target-duration=1 playlist-length=2 max-files=3<br />
</code><br />
<br />
In another, run a simple single threaded webserver which will serve html, javascript, and HLS to web clients:<br />
<code><br />
cd /dev/shm/hls/ && python3 -m http.server<br />
</code><br />
<br />
Alternatively, install a more efficient web server (<code>apt install nginx</code>) and set the server root for the default configuration to be /dev/shm/hls. This will serve on port 80, rather than port 8000 as the python3 server does by default.<br />
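As a sketch, such a server block could look like the following. The path and port follow the text above; the Cache-Control header is our addition so clients always re-fetch the rolling playlist. The config is only staged in /tmp here, since the enabled-site file location varies by distribution:<br />

```shell
#!/bin/sh
# Stage a minimal nginx site config for the tmpfs HLS directory.
cat > /tmp/hls-site.conf <<'EOF'
server {
    listen 80 default_server;
    root /dev/shm/hls;
    index index.html;

    # playlist.m3u8 is rewritten constantly; never let clients cache it
    location ~ \.m3u8$ {
        add_header Cache-Control no-cache;
    }
}
EOF
```

On Debian/Ubuntu this would typically replace <code>/etc/nginx/sites-available/default</code>, followed by <code>nginx -t</code> and <code>nginx -s reload</code>.<br />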
<br />
It should be possible to view the HLS stream directly in a web browser by visiting [http://pinecube:8000/ http://pinecube:8000/] if pinecube is the correct hostname and the name correctly resolves.<br />
<br />
You can also view the HLS stream with VLC: <code>vlc http://pinecube:8000/playlist.m3u8</code><br />
<br />
Or with gst-play-1.0: <code>gst-play-1.0 http://pinecube:8000/playlist.m3u8</code> (or with mpv, ffplay, etc)<br />
<br />
To find out about other options you can configure in the <code>hlssink</code> gstreamer element, you can run <code>gst-inspect-1.0 hlssink</code>.<br />
<br />
It is worth noting here that the <code>hlssink</code> element in GStreamer is not widely used in production environments. It is handy for testing, but for real-world free-software HLS live streaming deployments the standard tool today (January 2021) is nginx's RTMP module which can be used with ffmpeg to produce "adaptive streams" which are reencoded at varying quality levels. You can send data to an nginx-rtmp server from a gstreamer pipeline using the <code>rtmpsink</code> element. It is also worth noting that gstreamer has a new <code>hlssink2</code> element which we have not tested; perhaps in the future it will even have a webserver!<br />
<br />
=== v4l2rtspserver: h264 RTSP ===<br />
<br />
Install dependencies:<br />
<br />
apt install -y cmake gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-x \<br />
gstreamer1.0-plugins-base gstreamer1.0-plugins-{good,bad,ugly} \<br />
v4l-utils v4l2loopback-dkms alsa-utils libasound2-dev \<br />
libpango1.0-0 libpango1.0-dev liblivemedia-dev liblog4cpp5-dev \<br />
x264 libx264-dev libx265-dev libvpx-dev libjpeg-dev libssl-dev \<br />
vlc iotop autoconf automake libtool linux-headers-dev-sunxi;<br />
<br />
Install kernel source and build v4l2loopback module:<br />
<br />
apt install linux-source-5.11.3-dev-sunxi64 #Adjust kernel version number to match current installation with "uname -r"<br />
cd /usr/src/v4l2loopback-0.12.3; make && make install && depmod -a<br />
<br />
Build required v4l2 software:<br />
<br />
git clone --recursive https://github.com/mpromonet/v4l2tools && cd v4l2tools && make && make install;<br />
git clone --recursive https://github.com/mpromonet/v4l2rtspserver && cd v4l2rtspserver && cmake -D LIVE555URL=https://download.videolan.org/pub/contrib/live555/live.2020.08.19.tar.gz . && make && make install;<br />
<br />
Running the camera:<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:UYVY8_2X8/640x480@1/30]';<br />
modprobe v4l2loopback video_nr=10 debug=2;<br />
v4l2compress -fH264 -w -vv /dev/video0 /dev/video10 &<br />
v4l2rtspserver -v -S -W 640 -H 480 -F 10 -b /usr/local/share/v4l2rtspserver/ /dev/video10<br />
<br />
Note that you might get an error when running media-ctl indicating that the resource is busy. This could be because of the motion program that runs on the stock OS installation. Check and kill any running /usr/bin/motion processes before running the above steps.<br />
<br />
The v4l2compress/v4l2rtspserver method of streaming the camera uses around 45-50% of the CPU for compression of the stream into H264 (640x480@7fps) and around 1-2% of the CPU for serving the HLS stream. Total system RAM used is roughly 64MB, and the load average is about 0.4-0.5 when idle and 0.51-0.60 with one HLS client streaming the camera.<br />
<br />
You'll probably see about a 2-3s lag with this approach, possibly due to the H264 compression and the lack of hardware acceleration at the moment.<br />
<br />
=== gstreamer: JPEG RTSP ===<br />
<br />
GStreamer's RTSP server isn't an element you can use with gst-launch, but rather a library. We failed to build its example program, so we instead used this very small third-party tool which is based on it: https://github.com/sfalexrog/gst-rtsp-launch/<br />
<br />
After building gst-rtsp-launch (which is relatively simple on Ubuntu groovy; just <code>apt install libgstreamer1.0-dev libgstrtspserver-1.0-dev</code> first), you can read JPEG data directly from the camera and stream it via RTSP: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]' && gst-rtsp-launch 'v4l2src ! image/jpeg,width=1280,height=720 ! rtpjpegpay name=pay0'</code><br />
<br />
This stream can be played using <code>vlc rtsp://pinecube.local:8554/video</code> or mpv, ffmpeg, gst-play-1.0, etc. If you increase the resolution to 1920x1080, mpv and gst-play can still play it, but VLC will complain <code>The total received frame size exceeds the client's buffer size (2000000). 73602 bytes of trailing data will be dropped!</code> if you don't tell it to increase its buffer size with <code>--rtsp-frame-buffer-size=300000</code>.<br />
<br />
=== gstreamer: h264 RTSP ===<br />
<br />
Left as an exercise to the reader (please update the wiki). Hint: involves bits from the HLS and the JPEG RTSP examples above, but needs a <code>rtph264pay name=pay0</code> element.<br />
<br />
=== gstreamer: JPEG RTP UDP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! udpsink host=$client_ip port=8000</code><br />
<br />
Receive with: <code>gst-launch-1.0 udpsrc port=8000 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
Note that the sender must specify the recipient's IP address in place of <code>$client_ip</code>; this can actually be a multicast address allowing for many receivers! (You'll need to specify a valid multicast address in the receivers' pipeline also; see <code>gst-inspect-1.0 udpsrc</code> and <code>gst-inspect-1.0 udpsink</code> for details.)<br />
<br />
=== gstreamer: JPEG RTP TCP ===<br />
<br />
Configure camera: <code>media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'</code><br />
<br />
Transmit with: <code>gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! rtpjpegpay name=pay0 ! rtpstreampay ! tcpserversink host=0.0.0.0 port=1234</code><br />
<br />
Receive with: <code>gst-launch-1.0 tcpclientsrc host=pinecube.local port=1234 ! application/x-rtp-stream,encoding-name=JPEG ! rtpstreamdepay ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink</code><br />
<br />
=== gstreamer and socat: MJPEG HTTP server ===<br />
<br />
This rather ridiculous method uses bash, socat, and gstreamer to implement an HTTP-ish server which will serve your video as an MJPEG stream which is playable in browsers.<br />
<br />
This approach has the advantage of being relatively low latency (under a second), browser-compatible, and not needing to reencode anything on the CPU (it gets JPEG data from the camera itself). Compared to HLS, it has the disadvantages that MJPEG requires more bandwidth than h264 for similar quality, pause and seek are not possible, stalled connections cannot jump ahead when they are unstalled, and, in the case of this primitive implementation, it only supports one viewer at a time. (Though, really, the RTSP examples on this page perform very poorly with multiple viewers, so...)<br />
<br />
Gstreamer can almost do this by itself, as it has a multipartmux element which produces the headers that precede each frame. But sadly, despite various forum posts lamenting the lack of one over the last 12+ years, as of the end of the 50th year of the UNIX era (aka 2020), nobody has yet gotten a webserver element merged into gstreamer (which is necessary to produce the HTTP response that browsers other than Firefox require in order to play it). So, here is an absolutely minimal "webserver" which will get MJPEG displaying in a (single) browser.<br />
<br />
Create a file called <code>mjpeg-response.sh</code>:<br />
#!/bin/bash<br />
media-ctl --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1920x1080]'<br />
b="--duct_tape_boundary"<br />
echo -en "HTTP/1.1 200 OK\r\nContent-type: multipart/x-mixed-replace;boundary=$b\r\n\r\n"<br />
gst-launch-1.0 v4l2src ! image/jpeg,width=1920,height=1080 ! multipartmux boundary=$b ! fdsink fd=2 2>&1 >/dev/null<br />
<br />
Make it executable: <code>chmod +x mjpeg-response.sh</code><br />
<br />
Run the server: <code>socat TCP-LISTEN:8080,reuseaddr,fork EXEC:./mjpeg-response.sh</code><br />
<br />
And browse to http://pinecube.local:8080/ in your browser.<br />
<br />
== Debugging camera issues with the gstreamer pipeline ==<br />
<br />
If the camera does not appear to work, it is possible to change the <code>v4l2src</code> to <code>videotestsrc</code> and the gstreamer pipeline will produce a synthetic test image without using the camera hardware.<br />
<br />
If the camera output is only sensor-noise lines over a black or white image, the camera may be in a broken state. When in that state, the following kernel messages were observed:<br />
<pre><br />
[ 1703.577304] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.578570] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.596924] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.598060] alloc_contig_range: [46400, 467f5) PFNs busy<br />
[ 1703.600480] alloc_contig_range: [46400, 468f5) PFNs busy<br />
[ 1703.601654] alloc_contig_range: [46600, 469f5) PFNs busy<br />
[ 1703.619165] alloc_contig_range: [46100, 464f5) PFNs busy<br />
[ 1703.619528] alloc_contig_range: [46200, 465f5) PFNs busy<br />
[ 1703.619857] alloc_contig_range: [46300, 466f5) PFNs busy<br />
[ 1703.641156] alloc_contig_range: [46100, 464f5) PFNs busy<br />
</pre><br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [https://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB<br />
===== How to compile =====<br />
<br />
You can either set up a machine for the build environment, or use a Vagrant virtual machine provided by [https://elimo.io Elimo Engineering].<br />
<br />
====== On a dedicated machine ======<br />
<br />
Recommended system requirements:<br />
* OS: (L)Ubuntu 16.04<br />
* CPU: 64-bit based<br />
* Memory: 8 GB or higher<br />
* Disk: 15 GB free hard disk space<br />
<br />
'''Install required packages'''<br />
<pre><br />
sudo apt-get install p7zip-full git make u-boot-tools libxml2-utils bison build-essential gcc-arm-linux-gnueabi g++-arm-linux-gnueabi zlib1g-dev gcc-multilib g++-multilib libc6-dev-i386 lib32z1-dev<br />
</pre><br />
'''Install older Make 3.82 and Java JDK 6'''<br />
<pre><br />
pushd /tmp<br />
wget https://ftp.gnu.org/gnu/make/make-3.82.tar.gz<br />
tar xfv make-3.82.tar.gz<br />
cd make-3.82<br />
./configure<br />
make<br />
sudo apt purge -y make<br />
sudo ./make install<br />
cd ..<br />
# Please, download jdk-6u45-linux-x64.bin from https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html (requires free login)<br />
chmod +x jdk-6u45-linux-x64.bin <br />
./jdk-6u45-linux-x64.bin <br />
sudo mkdir /opt/java/<br />
sudo mv jdk1.6.0_45/ /opt/java/<br />
sudo update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.6.0_45/bin/javac 1<br />
sudo update-alternatives --install /usr/bin/java java /opt/java/jdk1.6.0_45/bin/java 1<br />
sudo update-alternatives --install /usr/bin/javaws javaws /opt/java/jdk1.6.0_45/bin/javaws 1<br />
sudo update-alternatives --config javac<br />
sudo update-alternatives --config java<br />
sudo update-alternatives --config javaws<br />
popd<br />
</pre><br />
'''Unpack SDK and then compile and pack the image'''<br />
<pre><br />
7z x 'PineCube Stock BSP-SDK ver1.0.7z'<br />
mv 'PineCube Stock BSP-SDK ver1.0' pinecube-sdk<br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
====== Using Vagrant ======<br />
<br />
You can avoid setting up your machine and just use Vagrant to spin up a development environment in a VM.<br />
<br />
Just clone the [https://github.com/elimo-engineering/pinecube-sdk-vagrant Elimo Engineering repo] and follow the instructions in the [https://github.com/elimo-engineering/pinecube-sdk-vagrant/blob/main/README.md readme file].<br />
<br />
After spinning up the VM, you just need to run the build:<br />
<pre><br />
cd pinecube-sdk/camdroid<br />
source build/envsetup.sh<br />
lunch<br />
mklichee<br />
make -j3<br />
pack<br />
</pre><br />
<br />
<br />
== Community Projects ==<br />
<br />
Share your project with a PineCube here!<br />
<br />
<br />
[[Category:PineCube]] [[Category:Allwinner(Sochip) S3]]</div>
PineTime FAQ (contribution by Newton688, 2021-01-01T02:44:17Z)
<hr />
<div>=== Does PineTime run Linux? ===<br />
<br />
No. [https://forum.pine64.org/showthread.php?tid=8112 Please read this forum article] for information about Linux on PineTime. Also check out the article [https://lupyuen.github.io/pinetime-rust-mynewt/articles/pinetime "PineTime doesn't run Linux... But that's OK!"]<br />
<br />
=== Why are there two versions of the PineTime in the store? ===<br />
See the next question and answer.<br />
<br />
=== Why can I only buy the closed version in a 3-pack, and the open version per one? ===<br />
<br />
TL;DR: The open PineTime is meant for development; the sealed one is meant for production use only, since a bad firmware upload can brick it. That is why sealed units are sold in packs of 3. <br />
<br />
At the current stage of development, there are good reasons to experiment only on an open device. If you install the wrong firmware, the device can be bricked until you find a way to open it, which will likely damage it.<br />
The idea is that if you develop an application for the PineTime, you test it first on an open device, and only install your firmware on sealed deployment devices once you know it is well tested. If you are at the deployment stage, you probably want more than one PineTime anyway. So, to prevent people from locking themselves out at the first test, it was decided to sell the sealed version only as a pack of 3; development can be done on an open device, where any issue is easily recovered.<br />
<br />
=== How long does it take to ship my PineTime? ===<br />
<br />
That depends on whether you chose Standard or Express shipping. Standard shipping for the dev kit may take up to a few weeks.<br />
<br />
=== How do I install new software on PineTime? ===<br />
<br />
The nRF Connect mobile app (Android and iOS) can be used to update the firmware on your PineTime over Bluetooth. See [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#download-and-test-our-pinetime-firmware "Download and Test Our PineTime Firmware"]<br />
<br />
Also see [[Reprogramming_the_PineTime|this page to see various methods of reprogramming the devkit PineTime the wired way]].<br />
<br />
If you have a Sealed PineTime, flash only Certified PineTime Firmware to your PineTime. If you flash non-Certified PineTime Firmware, your Sealed PineTime may be bricked permanently.<br />
<br />
The only Certified PineTime Firmware available today is [https://github.com/JF002/Pinetime/releases/tag/0.8.2 InfiniTime 0.8.2]. Download the file "dfu-0.8.2.zip" under "Assets" and flash to PineTime with nRF Connect. Refer to the instructions here: [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#download-and-test-our-pinetime-firmware "Download and Test Our PineTime Firmware"]<br />
<br />
Remember to validate the firmware after flashing: Swipe up to show the menu, tap the Ticks icon, tap "Validate"<br />
<br />
=== My PineTime arrived, now what? ===<br />
<br />
You should start by testing out all the features of the watch, to make sure everything works. Power it on and check the display.<br />
<br />
PineTime is shipped with InfiniTime firmware. Press the watch button to show the clock, then swipe up on the touchscreen to reveal the menu.<br />
<br />
On your Android phone, install the nRF Connect mobile app to sync the date and time with PineTime. See [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#set-pinetime-date-and-time-with-nrf-connect "Set PineTime Date and Time with nRF Connect"] (nRF Connect on iOS can't be used for setting the date and time, because it doesn't implement the GATT Time Service)<br />
<br />
Download the latest Certified PineTime Firmware (see the previous question) and flash to PineTime with nRF Connect. Refer to the instructions here: [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#download-and-test-our-pinetime-firmware "Download and Test Our PineTime Firmware"]<br />
<br />
Remember to validate the firmware after flashing: Swipe up to show the menu, tap the Ticks icon, tap "Validate"<br />
<br />
=== What's the OS that's preinstalled on the PineTime by default? ===<br />
<br />
PineTime ships with the open source [https://github.com/JF002/Pinetime InfiniTime firmware].<br />
<br />
To support firmware update and rollback, PineTime includes the open source [https://lupyuen.github.io/pinetime-rust-mynewt/articles/mcuboot MCUBoot Bootloader].<br />
<br />
=== Can we use this OS or its source code? ===<br />
<br />
Yes, [https://github.com/JF002/Pinetime InfiniTime] and the [https://lupyuen.github.io/pinetime-rust-mynewt/articles/mcuboot MCUBoot Bootloader] are open source.<br />
<br />
=== Why is the back exposed? Is it supposed to snap on? ===<br />
<br />
The back cover of the PineTime dev kit is exposed so that you can flash and debug the device with the SWD pins. The main unit and cover do not snap (lock) together. If you want to attach the back cover anyway, you can use glue or tape.<br />
<br />
=== What hardware should I use to flash code to the PineTime? ===<br />
<br />
There are several ways you can do this, check out [[Reprogramming the PineTime]]<br />
<br />
=== How do I connect the PineTime to a programmer? ===<br />
<br />
Here's how: [[PineTime devkit wiring]]<br />
<br />
=== How do I set the time on PineTime? ===<br />
<br />
You can use nRF Connect, a custom GadgetBridge build, or the proprietary Da Fit app. See [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#set-pinetime-date-and-time-with-nrf-connect "Set PineTime Date and Time with nRF Connect"]<br />
<br />
You can also set the time using your PinePhone or another Linux-based Bluetooth LE capable device with the BlueZ software installed. Install the bluez package and make sure your PineTime is awake and running InfiniTime 0.7.1 or later.<br />
<br />
$ bluetoothctl<br />
[ bluetooth ]# scan on<br />
...<br />
[NEW] Device D7:03:FB:6E:31:B2 Pinetime-JF<br />
...<br />
[bluetooth]# pair D7:03:FB:6E:31:B2<br />
Attempting to pair with D7:03:FB:6E:31:B2<br />
...<br />
[NEW] Characteristic (Handle 0xfd80)<br />
/org/bluez/hci0/dev_D7_03_FB_6E_31_B2/service0015/char0016<br />
00002a2b-0000-1000-8000-00805f9b34fb<br />
Current Time<br />
...<br />
[Pinetime-JF]# menu gatt<br />
...<br />
[Pinetime-JF]# select-attribute /org/bluez/hci0/dev_D7_03_FB_6E_31_B2/service0015/char0016<br />
[Pinetime-JF:/service0015/char0016]# read<br />
Attempting to read /org/bluez/hci0/dev_D7_03_FB_6E_31_B2/service0015/char0016<br />
[CHG] Attribute /org/bluez/hci0/dev_D7_03_FB_6E_31_B2/service0015/char0016 Value:<br />
b2 07 01 01 00 04 15 00 00 ......... <br />
b2 07 01 01 00 04 15 00 00 .........<br />
[Pinetime-JF:/service0015/char0016]# write "0xe4 0x07 0x0c 0x1f 0x0e 0x02 0x00 0x00 0x00"<br />
<br />
This is the format for the current time as hex bytes:<br />
<lsb of year> <msb of year> <month (1-12)> <day (1-31)> <hour (0-23)> <minute (0-59)> <seconds (0-59)> <weekday (1-7 where 1=Monday)> <fractions (1/256th of second)><br />
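The hex string used in the write command above can be assembled with a small shell helper (<code>ct_bytes</code> is our own hypothetical name, not a bluez command):<br />

```shell
#!/bin/sh
# Encode a date/time as the 9-byte Current Time payload described above.
# ct_bytes is our own helper, not part of bluez.
# Args: year month day hour minute second weekday
ct_bytes() {
  year=$1; shift
  # The year is little-endian: least significant byte first
  printf '0x%02x 0x%02x' $((year & 0xff)) $((year >> 8))
  for v in "$@" 0; do          # trailing 0 is the fractions field
    printf ' 0x%02x' "$v"
  done
  printf '\n'
}

# 2020-12-31 14:02:00, weekday left at 0 as in the example write above
ct_bytes 2020 12 31 14 2 0 0
# prints: 0xe4 0x07 0x0c 0x1f 0x0e 0x02 0x00 0x00 0x00
```

The output can be pasted directly into the <code>write</code> command in bluetoothctl.<br />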
<br />
=== Is there a standard agreed method of pushing OTA updates so that one could seal the PineTime dev kit nicely? ===<br />
<br />
InfiniTime supports firmware updates over Bluetooth LE with the nRF Connect mobile app. See [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#download-and-test-our-pinetime-firmware "Download and Test Our PineTime Firmware"]<br />
<br />
=== My PineTime's screen shows garbage, how do I fix it? ===<br />
<br />
This is usually caused by unplugging the device after it has booted; the display needs to be reinitialised. To do so, just restart the watch by removing power from it.<br />
<br />
=== I have experience developing on Arduino. How does the PineTime compare? ===<br />
<br />
To learn programming on PineTime, [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud check out this article]<br />
<br />
Arduino provides the Arduino IDE (or you can use the avr-gcc and avrdude command-line tools) to compile and upload code to an Arduino board. The PineTime and its ARM processor don't have this, so you'll have to familiarize yourself with tools like GCC for ARM and OpenOCD. Some experience with Arduino does translate over to the PineTime, especially if you've worked with LCDs or SPI. The PineTime is at least four times faster than an Arduino Uno (even faster at certain specific workloads due to hardware acceleration), and it has 32 times more RAM and 16 times more flash storage.<br />
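Those ratios follow from the headline specs (nRF52832 in the PineTime: 64 MHz Cortex-M4, 64 KB RAM, 512 KB flash; ATmega328P on the Uno: 16 MHz, 2 KB SRAM, 32 KB flash), as a quick shell sanity check shows:<br />

```shell
#!/bin/sh
# Compare the headline datasheet numbers behind the claim above.
echo "clock: $((64 / 16))x  ram: $((64 / 2))x  flash: $((512 / 32))x"
# prints: clock: 4x  ram: 32x  flash: 16x
```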
<br />
[https://github.com/lupyuen/ Lup Yuen Lee] (just call him Lup, rhymes with "Up") has written many articles on PineTime programming. [https://lupyuen.github.io/ Check out the articles here]<br />
<br />
=== Can I code firmware for PineTime without an actual PineTime? ===<br />
<br />
Yes, you may code PineTime Watch Faces and preview them in a web browser (thanks to WebAssembly)...<br />
<br />
[https://lupyuen.github.io/pinetime-rust-mynewt/articles/simulator PineTime Simulator]<br />
<br />
Then flash your firmware remotely to a real PineTime via Telegram, and watch your firmware run in a live video stream...<br />
<br />
[https://github.com/lupyuen/remote-pinetime-bot/blob/master/README.md Remote PineTime]<br />
<br />
=== What do I need for building PineTime firmware locally on my computer? ===<br />
<br />
Most flavours of PineTime firmware (InfiniTime, Hypnos, Klok, wasp-os) will build fine on Linux (x64, Arm32, Arm64) and macOS. Just follow the instructions provided.<br />
<br />
Download version 9-2020-q2-update of the [https://developer.arm.com/tools-and-software/open-source-software/developer-tools/gnu-toolchain/gnu-rm/downloads Arm Embedded Toolchain arm-none-eabi-gcc]. Other versions of gcc may have problems building the firmware correctly.<br />
<br />
On Windows, install [https://docs.microsoft.com/en-us/windows/wsl/about Windows Subsystem for Linux (WSL)] and execute the build steps inside the WSL Terminal (instead of the Windows Command Prompt). USB Programmers (like ST-Link and JLink) are not supported in WSL, so use the Windows Command Prompt to flash your built firmware to PineTime.<br />
<br />
[https://github.com/lupyuen/pinetime-rust-mynewt/blob/master/README.md pinetime-rust-mynewt] firmware for PineTime supports building and flashing via the Windows Command Prompt (no need for MinGW and Docker).<br />
<br />
=== Can I use Pinebook Pro for developing PineTime? ===<br />
<br />
Yes, use version 9-2020-q2-update of the [https://developer.arm.com/tools-and-software/open-source-software/developer-tools/gnu-toolchain/gnu-rm/downloads Arm Embedded Toolchain arm-none-eabi-gcc]. Other versions of gcc may have problems building the firmware correctly.<br />
<br />
=== What is ARM Semihosting? ===<br />
<br />
We use the SWD (Serial Wire Debug) protocol created by ARM for flashing and debugging PineTime's nRF52832 microcontroller, which contains an ARM CPU. (SWD is derived from standard JTAG, but uses fewer wires.) With ARM CPUs you can trigger a software breakpoint and allow the debugger (OpenOCD) to do something really nifty: display a message, read console input, dump out a file, even read a file. That's called ARM Semihosting. [http://www.keil.com/support/man/docs/armcc/armcc_pge1358787046598.htm More about ARM Semihosting]<br />
<br />
=== What is OpenOCD? ===<br />
<br />
OpenOCD is the Open On-Chip Debugger. It's the software that drives your microcontroller debugger/flasher, and we need it for any kind of flashing and debugging with a Raspberry Pi or ST-Link. gdb talks to OpenOCD when debugging firmware, and gdb also works with VSCode for debugging firmware visually. [http://openocd.org/doc-release/html/About.html#What-is-OpenOCD_003f More about OpenOCD]<br />
<br />
Please use [https://xpack.github.io/openocd xPack OpenOCD] with PineTime. Other versions of OpenOCD seem to have problems with PineTime.<br />
<br />
=== How do I remove flash protection? ===<br />
<br />
PineTime watches shipped before 20 Sep 2020 have flash protection enabled.<br />
<br />
The flash protection can be removed using several different methods. If you don't have anything except the PineTime, not even a Raspberry Pi, then you have to order a programmer online: you can use a J-Link, a CMSIS-DAP dongle, or various other programmers. See [[Reprogramming_the_PineTime|this page to see various methods of reprogramming the PineTime]].<br />
<br />
If your PineTime was shipped after 20 Sep 2020, you don't need to remove flash protection. They are shipped with flash protection disabled. You can flash and debug PineTime right away with ST-Link, JLink and Raspberry Pi.<br />
<br />
=== Why can't you use ST-Link to remove nRF52 Flash Protection? ===<br />
<br />
Because ST-Link is a High Level Adapter. It doesn't really implement all SWD functions, just a subset (probably to keep the price low). More details in the section "Why Visual Studio Code with ST-Link (instead of nRFgo Studio with J-LINK)" in the article [https://medium.com/@ly.lee/coding-nrf52-with-rust-and-apache-mynewt-on-visual-studio-code-9521bcba6004?source=friends_link&sk=bb4e2523b922d0870259ab3fa696c7da "Coding nRF52 with Rust and Apache Mynewt on Visual Studio Code"].<br />
<br />
=== Since we need a low level SWD adapter like Raspberry Pi anyway, can we do everything on a Pi instead of ST-Link + Windows? ===<br />
<br />
Yes, Raspberry Pi works for flashing and debugging PineTime, even for removing flash protection. We have a special version of OpenOCD called OpenOCD SPI that talks to PineTime's SWD port over SPI (without bit-banging). See [https://github.com/lupyuen/pinetime-updater/blob/master/README.md PineTime Updater]<br />
<br />
=== Is there a 3D model of PineTime available somewhere? ===<br />
<br />
Not yet. Someone did design a cover you can snap on to keep the back shut. [https://www.thingiverse.com/thing:4172849 More details]<br />
<br />
=== Are there any alternatives to the wrist band provided with the PineTime? ===<br />
<br />
No, but the PineTime accepts standard 20 mm wrist bands, which are widely available from third parties.<br />
<br />
Note that some sellers interpret the standard differently, so always check the fit to make sure the band matches the one used by the PineTime.<br />
<br />
=== I'm stuck. How can I get help? ===<br />
<br />
Chat with the PineTime Community on [[PineTime#Community|Matrix / Discord / Telegram / IRC]] (They are bridged into a single chatroom)<br />
<br />
[[Category:PineTime]]</div>Newton688https://wiki.pine64.org/index.php?title=PineTime_FAQ&diff=8823PineTime FAQ2020-12-31T19:49:34Z<p>Newton688: </p>
<hr />
<div>=== Does PineTime run Linux? ===<br />
<br />
No. [https://forum.pine64.org/showthread.php?tid=8112 Please read this forum article] for information about Linux on PineTime. Also check out the article [https://lupyuen.github.io/pinetime-rust-mynewt/articles/pinetime "PineTime doesn't run Linux... But that's OK!"]<br />
<br />
=== Why are there two versions of the PineTime in the store? ===<br />
See the below question and answer<br />
<br />
=== Why can I only buy the closed version in a 3-pack, and the open version per one? ===<br />
<br />
TL;DR: The open PineTime is for development; the sealed one is for production use only, because a bad firmware upload can brick it. That is why sealed units are sold in packs of three.<br />
<br />
At this stage of development there are good reasons to experiment only on an open device: if you install the wrong firmware, your device could be bricked until you find a way to open it, which will likely damage the device.<br />
The idea is that if you want to develop an application for the PineTime, you test it on an open device first, and only install your firmware on deployment devices once you know it is well tested. If you are at the deployment stage, having more than one PineTime is likely the point. So, to prevent people from locking themselves out at the first test, it was decided to sell the sealed version only in packs of three. Development can be done on an open device, where any issues can be easily handled.<br />
<br />
=== How long does it take to ship my PineTime? ===<br />
<br />
That depends on whether you chose Standard or Express shipping. Standard shipping for the dev kit may take up to a few weeks.<br />
<br />
=== How do I install new software on PineTime? ===<br />
<br />
The nRF Connect mobile app (Android and iOS) can be used to update the firmware on your PineTime over Bluetooth LE. See [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#download-and-test-our-pinetime-firmware "Download and Test Our PineTime Firmware"]<br />
<br />
Also see [[Reprogramming_the_PineTime|this page to see various methods of reprogramming the devkit PineTime the wired way]].<br />
<br />
If you have a Sealed PineTime, flash only Certified PineTime Firmware to your PineTime. If you flash non-Certified PineTime Firmware, your Sealed PineTime may be bricked permanently.<br />
<br />
The only Certified PineTime Firmware available today is [https://github.com/JF002/Pinetime/releases/tag/0.8.2 InfiniTime 0.8.2]. Download the file "dfu-0.8.2.zip" under "Assets" and flash to PineTime with nRF Connect. Refer to the instructions here: [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#download-and-test-our-pinetime-firmware "Download and Test Our PineTime Firmware"]<br />
<br />
Remember to validate the firmware after flashing: Swipe up to show the menu, tap the Ticks icon, tap "Validate"<br />
<br />
=== My PineTime arrived, now what? ===<br />
<br />
You should start by testing out all the features of the watch, to make sure everything works. Power it on and check the display.<br />
<br />
PineTime is shipped with InfiniTime firmware. Press the watch button to show the clock, then swipe up on the touchscreen to reveal the menu.<br />
<br />
On your Android phone, install the nRF Connect mobile app to sync the date and time with PineTime. See [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#set-pinetime-date-and-time-with-nrf-connect "Set PineTime Date and Time with nRF Connect"] (nRF Connect on iOS can't be used for setting the date and time, because it doesn't implement the GATT Time Service)<br />
<br />
Download the latest Certified PineTime Firmware (see the previous question) and flash to PineTime with nRF Connect. Refer to the instructions here: [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#download-and-test-our-pinetime-firmware "Download and Test Our PineTime Firmware"]<br />
<br />
Remember to validate the firmware after flashing: Swipe up to show the menu, tap the Ticks icon, tap "Validate"<br />
<br />
=== What's the OS that's preinstalled on the PineTime by default? ===<br />
<br />
PineTime ships with the open source [https://github.com/JF002/Pinetime InfiniTime firmware].<br />
<br />
To support firmware update and rollback, PineTime includes the open source [https://lupyuen.github.io/pinetime-rust-mynewt/articles/mcuboot MCUBoot Bootloader].<br />
<br />
=== Can we use this OS or its source code? ===<br />
<br />
Yes, [https://github.com/JF002/Pinetime InfiniTime] and the [https://lupyuen.github.io/pinetime-rust-mynewt/articles/mcuboot MCUBoot Bootloader] are open source.<br />
<br />
=== Why is the back exposed? Is it supposed to snap on? ===<br />
<br />
The back cover of the PineTime dev kit is exposed so that you can flash and debug the device with the SWD pins. The main unit and cover do not snap (lock) together. If you want to attach the back cover anyway, you can use glue or tape.<br />
<br />
=== What hardware should I use to flash code to the PineTime? ===<br />
<br />
There are several ways you can do this, check out [[Reprogramming the PineTime]]<br />
<br />
=== How do I connect the PineTime to a programmer? ===<br />
<br />
Here's how: [[PineTime devkit wiring]]<br />
<br />
=== How do I set the time on PineTime? ===<br />
<br />
You can use nRF Connect, a custom GadgetBridge build, or the proprietary Da Fit app. See [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#set-pinetime-date-and-time-with-nrf-connect "Set PineTime Date and Time with nRF Connect"]<br />
<br />
You can also set the time using your PinePhone or other Linux-based Bluetooth LE capable device with the Bluez software installed. Install the bluez package and make sure your PineTime is running and awake with InfiniTime 0.7.1 or later.<br />
<br />
$ bluetoothctl<br />
[bluetooth]# scan on<br />
...<br />
[NEW] Device D7:03:FB:6E:31:B2 Pinetime-JF<br />
...<br />
[bluetooth]# pair D7:03:FB:6E:31:B2<br />
Attempting to pair with D7:03:FB:6E:31:B2<br />
...<br />
[NEW] Characteristic (Handle 0xfd80)<br />
/org/bluez/hci0/dev_D7_03_FB_6E_31_B2/service0015/char0016<br />
00002a2b-0000-1000-8000-00805f9b34fb<br />
Current Time<br />
...<br />
[Pinetime-JF]# menu gatt<br />
...<br />
[Pinetime-JF]# select-attribute /org/bluez/hci0/dev_D7_03_FB_6E_31_B2/service0015/char0016<br />
[Pinetime-JF:/service0015/char0016]# read<br />
Attempting to read /org/bluez/hci0/dev_D7_03_FB_6E_31_B2/service0015/char0016<br />
[CHG] Attribute /org/bluez/hci0/dev_D7_03_FB_6E_31_B2/service0015/char0016 Value:<br />
b2 07 01 01 00 04 15 00 00 ......... <br />
b2 07 01 01 00 04 15 00 00 .........<br />
[Pinetime-JF:/service0015/char0016]# write "0xe4 0x07 0x0c 0x1f 0x0e 0x02 0x00 0x00 0x00"<br />
<br />
This is the format for the current time as hex bytes:<br />
<lsb of year> <msb of year> <month (1-12)> <day (1-31)> <hour (0-23)> <minute (0-59)> <seconds (0-59)> <weekday (1-7 where 1=Monday)> <fractions (1/256th of second)><br />
<br />
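The nine-byte payload above can be generated for any date and time with a short script. Here is a minimal sketch in Python (the `cts_payload` and `bluetoothctl_arg` helper names are illustrative, not part of any PineTime tooling):

```python
import struct
from datetime import datetime

def cts_payload(dt: datetime) -> bytes:
    """Pack a datetime into the 9-byte GATT Current Time value:
    year (little-endian 16-bit), month, day, hour, minute, second,
    weekday (1 = Monday), fractions of a second (1/256 s)."""
    return struct.pack(
        "<HBBBBBBB",
        dt.year, dt.month, dt.day,
        dt.hour, dt.minute, dt.second,
        dt.isoweekday(),  # datetime already uses 1 = Monday ... 7 = Sunday
        0,                # fractions of a second; 0 is accurate enough here
    )

def bluetoothctl_arg(payload: bytes) -> str:
    """Render the payload as the quoted hex string bluetoothctl expects."""
    return '"' + " ".join(f"0x{b:02x}" for b in payload) + '"'

# 2020-12-31 14:02:00 (a Thursday, so weekday = 4):
print(bluetoothctl_arg(cts_payload(datetime(2020, 12, 31, 14, 2, 0))))
# "0xe4 0x07 0x0c 0x1f 0x0e 0x02 0x00 0x04 0x00"
```

Paste the printed string directly after the `write` command in the bluetoothctl `gatt` menu, as shown in the session above.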
=== Is there a standard agreed method of pushing OTA updates so that one could seal the PineTime dev kit nicely? ===<br />
<br />
InfiniTime supports firmware updates over Bluetooth LE with the nRF Connect mobile app. See [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud#download-and-test-our-pinetime-firmware "Download and Test Our PineTime Firmware"]<br />
<br />
=== My PineTime's screen shows garbage, how do I fix it? ===<br />
<br />
This is usually caused by unplugging the device after it has booted; the display needs to be reinitialised. To do so, just restart the watch by removing power to it.<br />
<br />
=== I have experience developing on Arduino. How does the PineTime compare? ===<br />
<br />
To learn programming on PineTime, [https://lupyuen.github.io/pinetime-rust-mynewt/articles/cloud check out this article]<br />
<br />
Arduino provides the Arduino IDE (or you use the avr-gcc and avrdude command-line tools) which you can use to compile and upload code to an Arduino board. The PineTime and its ARM processor don't have this, so you'll have to familiarize yourself with tools like GCC for ARM, and OpenOCD. Some experience with Arduino does translate over to the PineTime, especially if you've worked with LCDs, or SPI. The PineTime is at least four times faster than an Arduino Uno (even faster at certain specific workloads due to hardware acceleration), and it has 32 times more RAM and 16 times more flash storage.<br />
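As a back-of-the-envelope sanity check of those ratios (not a benchmark), compare the published headline specs: the PineTime's nRF52832 runs at 64 MHz with 64 KB RAM and 512 KB flash, while the Uno's ATmega328P runs at 16 MHz with 2 KB RAM and 32 KB flash:

```python
# Headline specs: nRF52832 (PineTime) vs ATmega328P (Arduino Uno)
NRF52832 = {"clock_mhz": 64, "ram_kb": 64, "flash_kb": 512}
ATMEGA328P = {"clock_mhz": 16, "ram_kb": 2, "flash_kb": 32}

# Ratio of each spec, PineTime over Uno
ratios = {k: NRF52832[k] // ATMEGA328P[k] for k in NRF52832}
print(ratios)  # {'clock_mhz': 4, 'ram_kb': 32, 'flash_kb': 16}
```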
<br />
[https://github.com/lupyuen/ Lup Yuen Lee] (just call him Lup, rhymes with "Up") has written many articles on PineTime programming. [https://lupyuen.github.io/ Check out the articles here]<br />
<br />
=== Can I code firmware for PineTime without an actual PineTime? ===<br />
<br />
Yes, you may code PineTime Watch Faces and preview them in a web browser (thanks to WebAssembly)...<br />
<br />
[https://lupyuen.github.io/pinetime-rust-mynewt/articles/simulator PineTime Simulator]<br />
<br />
Then flash your firmware remotely to a real PineTime via Telegram, and watch your firmware run in a live video stream...<br />
<br />
[https://github.com/lupyuen/remote-pinetime-bot/blob/master/README.md Remote PineTime]<br />
<br />
=== What do I need for building PineTime firmware locally on my computer? ===<br />
<br />
Most flavours of PineTime firmware (InfiniTime, Hypnos, Klok, wasp-os) will build fine on Linux (x64, Arm32, Arm64) and macOS. Just follow the instructions provided.<br />
<br />
Download version 9-2020-q2-update of the [https://developer.arm.com/tools-and-software/open-source-software/developer-tools/gnu-toolchain/gnu-rm/downloads Arm Embedded Toolchain arm-none-eabi-gcc]. Other versions of gcc may have problems building the firmware correctly.<br />
<br />
On Windows, install [https://docs.microsoft.com/en-us/windows/wsl/about Windows Subsystem for Linux (WSL)] and execute the build steps inside the WSL Terminal (instead of the Windows Command Prompt). USB Programmers (like ST-Link and JLink) are not supported in WSL, so use the Windows Command Prompt to flash your built firmware to PineTime.<br />
<br />
[https://github.com/lupyuen/pinetime-rust-mynewt/blob/master/README.md pinetime-rust-mynewt] firmware for PineTime supports building and flashing via the Windows Command Prompt (no need for MinGW and Docker).<br />
<br />
=== Can I use Pinebook Pro for developing PineTime? ===<br />
<br />
Yes, use version 9-2020-q2-update of the [https://developer.arm.com/tools-and-software/open-source-software/developer-tools/gnu-toolchain/gnu-rm/downloads Arm Embedded Toolchain arm-none-eabi-gcc]. Other versions of gcc may have problems building the firmware correctly.<br />
<br />
=== What is ARM Semihosting? ===<br />
<br />
We use the SWD (Serial Wire Debug) protocol created by ARM for flashing and debugging PineTime's nRF52832 microcontroller, which contains an ARM CPU. (SWD is derived from standard JTAG, but uses fewer wires.) With ARM CPUs you can trigger a software breakpoint and allow the debugger (OpenOCD) to do something really nifty: display a message, read console input, dump out a file, even read a file. That's called ARM Semihosting. [http://www.keil.com/support/man/docs/armcc/armcc_pge1358787046598.htm More about ARM Semihosting]<br />
<br />
=== What is OpenOCD? ===<br />
<br />
OpenOCD is the Open On-Chip Debugger. It's the software that drives your microcontroller debugger/flasher, and we need it for any kind of flashing and debugging with a Raspberry Pi or ST-Link. gdb talks to OpenOCD when debugging firmware, and gdb also works with VSCode for debugging firmware visually. [http://openocd.org/doc-release/html/About.html#What-is-OpenOCD_003f More about OpenOCD]<br />
<br />
Please use [https://xpack.github.io/openocd xPack OpenOCD] with PineTime. Other versions of OpenOCD seem to have problems with PineTime.<br />
<br />
=== How do I remove flash protection? ===<br />
<br />
PineTime watches shipped before 20 Sep 2020 have flash protection enabled.<br />
<br />
The flash protection can be removed using several different methods. If you don't have anything except the PineTime, not even a Raspberry Pi, then you have to order a programmer online: you can use a J-Link, a CMSIS-DAP dongle, or various other programmers. See [[Reprogramming_the_PineTime|this page to see various methods of reprogramming the PineTime]].<br />
<br />
If your PineTime was shipped after 20 Sep 2020, you don't need to remove flash protection. They are shipped with flash protection disabled. You can flash and debug PineTime right away with ST-Link, JLink and Raspberry Pi.<br />
<br />
=== Why can't you use ST-Link to remove nRF52 Flash Protection? ===<br />
<br />
Because ST-Link is a High Level Adapter. It doesn't really implement all SWD functions, just a subset (probably to keep the price low). More details in the section "Why Visual Studio Code with ST-Link (instead of nRFgo Studio with J-LINK)" in the article [https://medium.com/@ly.lee/coding-nrf52-with-rust-and-apache-mynewt-on-visual-studio-code-9521bcba6004?source=friends_link&sk=bb4e2523b922d0870259ab3fa696c7da "Coding nRF52 with Rust and Apache Mynewt on Visual Studio Code"].<br />
<br />
=== Since we need a low level SWD adapter like Raspberry Pi anyway, can we do everything on a Pi instead of ST-Link + Windows? ===<br />
<br />
Yes, Raspberry Pi works for flashing and debugging PineTime, even for removing flash protection. We have a special version of OpenOCD called OpenOCD SPI that talks to PineTime's SWD port over SPI (without bit-banging). See [https://github.com/lupyuen/pinetime-updater/blob/master/README.md PineTime Updater]<br />
<br />
=== Is there a 3D model of PineTime available somewhere? ===<br />
<br />
Not yet. Someone did design a cover you can snap on to keep the back shut. [https://www.thingiverse.com/thing:4172849 More details]<br />
<br />
=== Are there any alternatives to the wrist band provided with the PineTime? ===<br />
<br />
No, but the PineTime accepts standard 20 mm wrist bands, which are widely available from third parties.<br />
<br />
Note that some sellers interpret the standard differently, so always check the fit to make sure the band matches the one used by the PineTime.<br />
<br />
=== I'm stuck. How can I get help? ===<br />
<br />
Chat with the PineTime Community on [[PineTime#Community|Matrix / Discord / Telegram / IRC]] (They are bridged into a single chatroom)<br />
<br />
[[Category:PineTime]]</div>Newton688https://wiki.pine64.org/index.php?title=PineCube&diff=6747PineCube2020-08-25T18:26:58Z<p>Newton688: </p>
<hr />
<div>PAGE UNDER CONSTRUCTION, INFO SUBJECT TO CHANGE<br />
<br />
<br />
<br />
== Specifications ==<br />
* '''Dimensions:''' 55mm x 51mm x 51.5mm<br />
* '''Weight:''' 55g<br />
* '''Storage:'''<br />
** MicroSD slot, bootable<br />
** 128Mb SPI Nor Flash, bootable<br />
* '''Cameras:''' OV5640, 5Mpx <br />
* '''CPU:''' Allwinner(Sochip) ARM Cortex-A7 MPCore, 800MHz<br />
* '''RAM:''' 128MB DDR3<br />
* '''I/O:''' 10/100Mbps Ethernet with passive PoE, USB 2.0 A host, 26 pins GPIO port, internal mic<br />
* '''Network:'''<br />
** WiFi<br />
* '''Screen:''' optional 4.5" RGB LCD screen<br />
* '''Battery:''' optional 1200mAh (1.2Ah)<br />
* '''Misc. features:''' <br />
** Volume and home buttons<br />
** Speakers and Microphone<br />
** DC in: 5V 1A from microUSB Port and GPIO port, 8V-24V from Ethernet passive PoE.<br />
<br />
<br />
== PineCube board information, schematics and certifications ==<br />
* PineCube mainboard schematic:<br />
** [http://files.pine64.org/doc/PineCube/PineCube%20MainBoard%20Schematic%20ver%201.0-20200727.pdf PineCube mainboard Released Schematic ver 1.0]<br />
* PineCube faceboard schematic:<br />
** [http://files.pine64.org/doc/PineCube/PineCube%20FaceBoard%20Schematic%20ver%201.0-20200727.pdf PineCube faceboard Released Schematic ver 1.0]<br />
* PineCube certifications:<br />
<br />
<br />
<br />
== Datasheets for components and peripherals ==<br />
* Allwinner (Sochip) S3 SoC information:<br />
** [http://files.pine64.org/doc/datasheet/pinecube/S3_Datasheet_V1.1-20180123.pdf Sochip S3 SoC Data Sheet V1.1]<br />
<br />
* X-Powers AXP209 PMU (Power Management Unit) information:<br />
** [http://files.pine64.org/doc/datasheet/pinecube/AXP209_Datasheet_v1.0en.pdf AXP209 PMIC datasheet]<br />
<br />
* CMOS camera module information:<br />
** [http://files.pine64.org/doc/datasheet/pinephone/OV5640_datasheet.pdf OV5640 5MP CMOS Image Sensor SoC datasheet]<br />
<br />
* LCD touch screen panel information:<br />
<br />
* Lithium battery information:<br />
<br />
* WiFi/BT module information:<br />
** [http://files.pine64.org/doc/datasheet/pinecube/rtl8189es.pdf RTL8189ES specification]<br />
<br />
<br />
<br />
== Operating Systems ==<br />
<br />
=== Stock Linux ===<br />
<br />
<br />
<br />
<br />
== SDK ==<br />
<br />
==== Stock Linux ====<br />
* [http://files.pine64.org/SDK/PineCube/PineCube%20Stock%20BSP-SDK%20ver1.0.7z Direct Download from pine64.org]<br />
** MD5 (7zip file): efac108dc98efa0a1f5e77660ba375f8<br />
** File Size: 3.50GB</div>Newton688https://wiki.pine64.org/index.php?title=Pinebook_Pro&diff=5953Pinebook Pro2020-06-19T20:27:53Z<p>Newton688: </p>
<hr />
<div>= User Guide =<br />
== Introducing PineBook Pro == <br />
[[File:PBP.jpg|400px|thumb|right|Pinebook Pro running stock Debian with MATE]]<br />
<br />
The Pinebook Pro is a Linux and *BSD ARM laptop from [https://www.pine64.org/ PINE64]<br />
<br />
It is built to be a compelling alternative to mid-range Chromebooks that people convert into Linux laptops. It features an IPS 1080p 14″ LCD panel, a premium magnesium alloy shell, high capacity eMMC storage, a 10,000 mAh battery, and the modularity that only an open source project can deliver. <br />
<br />
Key features include: the RK3399 SoC; USB-C for data, video-out and power-in (5V 3A); privacy switches for the microphone, BT/WiFi module, and camera; and expandable storage via NVMe (PCIe x4) with an optional adapter. <br />
<br />
The Pinebook Pro is equipped with 4GB LPDDR4 system memory, high capacity eMMC flash storage, and 128Mb SPI boot flash. The I/O includes: 1x microSD card reader (bootable), 1x USB 2.0, 1x USB 3.0, 1x USB Type-C host with DP 1.2 and power-in, PCIe x4 for an NVMe SSD drive (requires an optional adapter), and UART (via the headphone jack, enabled by setting an internal switch). <br />
<br />
The keyboard and trackpad both use the USB 2.0 protocol. The LCD panel uses the eDP display protocol.<br />
<br />
Many different Operating Systems (OS) are freely available from the open source community and partner projects. These include various flavors of Linux (Ubuntu, Debian, Manjaro, etc.) and *BSD. <br />
<br />
== Software and OS Image Downloads ==<br />
<br />
=== Default Manjaro KDE Desktop Quick Start ===<br />
<br />
When you first get your Pinebook Pro and boot it up for the first time, it'll come with Manjaro using the KDE desktop. The Pinebook Pro is officially supported by the Manjaro ARM project, and support can be found on the [https://forum.manjaro.org/c/manjaro-arm/78 Manjaro ARM forums.]<br />
<br />
On first boot, it will ask for certain information such as your timezone location, keyboard layout, username, password, and hostname. Most of these should be self-explanatory. Note that the hostname it asks for should be thought of as the "codename" of your machine, and if you don't know what it's about, you can make something up.<br />
<br />
After you're on the desktop, be sure to update it as soon as possible and reboot after updates are finished installing. If nothing appears when you click on the Networking icon in your system tray to connect to your Wi-Fi, ensure the Wi-Fi [https://wiki.pine64.org/index.php/Pinebook_Pro#ANSI_Fn_.2B_F_keys_wrong_for_F9.2C_F10.2C_F11_and_F12 privacy switch] is not disabled.<br />
<br />
=== [[Pinebook Pro_Software_Release|Pinebook Pro images]] ===<br />
Under [[Pinebook Pro Software Release|'Pinebook Pro Software and OS Image Download Section']] you will find a complete list of currently supported Operating System images that work with the Pinebook as well as other related software. <br />
<br />
The list includes OS images and descriptions of:<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Manjaro_ARM}} [[File:Manjaro.png|125px]]] [[PinebookPro_Software_Release#Manjaro ARM|'''Manjaro ARM (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Debian_Desktop}} [[File:Debian.png|125px]]] [[PinebookPro_Software_Release#Debian Desktop|'''Debian Desktop (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Bionic_LXDE}} [[File:Lxde.png|125px]]] [[PinebookPro_Software_Release#Bionic LXDE|'''Bionic LXDE (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Bionic_Mate}} [[File:Mate.png|125px]]] [[PinebookPro_Software_Release#Bionic Mate|'''Bionic Mate (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Fedora}} [[File:Fedora1.png|125px]]] [[PinebookPro_Software_Release#Fedora|'''Fedora (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#OpenSUSE}} [[File:Opensuse1.png|125px]]] [[PinebookPro_Software_Release#OpenSUSE|'''OpenSUSE (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Q4OS}} [[File:Q4os.png|125px]]] [[PinebookPro_Software_Release#Q4OS|'''Q4OS (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Armbian}} [[File:Armbian.png|125px]]] [[PinebookPro_Software_Release#Armbian|'''Armbian (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#NetBSD}} [[File:Netbsd.png|125px]]] [[PinebookPro_Software_Release#NetBSD|'''NetBSD (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:Pinebook_Pro_Software_Release#OpenBSD}} [[File:Puffy_mascot_openbsd.png|125px]]] [[Pinebook_Pro_Software_Release#OpenBSD|'''OpenBSD release for ARM64''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Chromium}} [[File:Chromium.jpg|125px]]] [[PinebookPro_Software_Release#Chromium|'''Chromium (microSD and eMMC Boot)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Android_7.1_microSD}} [[File:Android_7.png|125px]]] [[PinebookPro_Software_Release#Android_7.1_microSD|'''Android 7.1 (microSD Boot)''']] &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; [{{fullurl:PinebookPro_Software_Release#Android_7.1_eMMC}} [[File:Android_7.png|125px]]] [[PinebookPro_Software_Release#Android_7.1_eMMC|'''Android 7.1 (eMMC)''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Daniel_Thompson.27s_Debian_Installer_for_the_Pinebook_Pro}} [[File:Debian.png|125px]]] [[PinebookPro_Software_Release#Daniel_Thompson.27s_Debian_Installer_for_the_Pinebook_Pro|'''Debian Installer for Pinebook Pro''']]<br />
<br />
[{{fullurl:Pinebook_Pro_Software_Release#Gentoo_Script_for_Pinebook_Pro}} [[File:Gentoo.png|125px]]]<br />
[[Pinebook_Pro_Software_Release#Gentoo_Script_for_Pinebook_Pro|'''Gentoo Script for Pinebook Pro''']]<br />
<br />
[{{fullurl:PinebookPro_Software_Release#Kali_Linux_for_Pinebook_Pro}} [[File:Kali.jpeg|125px]]] [[PinebookPro_Software_Release#Kali_Linux_for_Pinebook_Pro|'''Kali Script for Pinebook Pro (microSD and eMMC Boot)''']]<br />
<br />
=== Quick Links to OS Images Build Sources===<br />
'''Some of the provided OS images are still <span style="color:#FF0000">beta or nightly builds</span> and are only fit for testing purposes. These images ought to be avoided for normal usage - use them at <span style="color:#FF0000">your own risk</span>'''<br />
* [https://github.com/ayufan-rock64/linux-build/releases/ ayufan's Linux build repo] (Includes Ubuntu 20.04 Focal Fossa and Debian Buster images. Click 'Assets' at the end of the releases text to view images) <br />
* [https://github.com/ayufan-rock64/chromiumos-build/releases ayufan's Chromium OS build repo]<br />
* [https://github.com/mrfixit2001/debian_desktop/releases mrfixit2001's Linux debian desktop build repo]<br />
<br />
== Keyboard ==<br />
The Pinebook Pro is available in two keyboard configurations: ISO and ANSI. Both the keyboard and trackpad in the Pinebook Pro use the USB 2.0 protocol and show up as such in xinput. The keyboard features function (Fn) keys in the F-key row, which include display brightness controls, sound volume, trackpad lock, and other functionality. There is also a custom PINE64 logo key that functions as a Menu/Super key. It also has a secondary function for setting the privacy switches. <br />
<br />
The keyboard firmware binary can be flashed from userspace using the provided open source utility. <br />
<br />
Documentation for the keyboard can be found in [[#Datasheets for Components and Peripherals|Datasheets for Components and Peripherals]]. <br />
<br />
=== Typing special characters ===<br />
The [[Wikipedia:British_and_American_keyboards#Other_keyboard_layouts|UK ISO Layout]] does not have dedicated keys for characters like the German umlauts (Ä,Ö,Ü, etc). Certain characters can still be generated by means of either key combinations or key sequences. <br />
{| class="wikitable"<br />
!Character<br />
!Key combination/sequence<br />
|-<br />
|Ä, Ö, Ü, ä, ö, ü<br />
|[[Wikipedia:AltGr_key|[AltGr]]]+'[' followed by [A], [O], [U], [a], [o] or [u]<br />
|-<br />
|µ<br />
|[AltGr]+[m]<br />
|-<br />
|Ø, ø<br />
|[AltGr]+[O], [AltGr]+[o]<br />
|-<br />
|@<br />
|[AltGr]+[q] (as on the German layout)<br />
|-<br />
|ß<br />
|[AltGr]+[s]<br />
|-<br />
|§<br />
|[AltGr]+[S]<br />
|-<br />
|°<br />
|[AltGr]+[)]<br />
|}<br />
<br />
=== Privacy Switches ===<br />
There are three privacy switches mapped to the F10, F11 and F12 keys on the Pinebook Pro keyboard. They de/activate the following:<br />
<br />
{| class="wikitable"<br />
|+ Privacy switch function and description<br />
! Combination<br />
! Effect<br />
! Description<br />
! Notes<br />
|-<br />
! scope=row | PINE64 logo key+F10<br />
| Microphone Privacy switch || CAPs lock LED blinks. 2 blinks = enabled, 3 blinks = disabled<br />
|-<br />
! scope=row | PINE64 logo key+F11<br />
| WiFi Privacy switch || NUM lock LED blinks. 2 blinks = WiFi enabled / killswitch disabled, 3 blinks = WiFi disabled / killswitch enabled.<br />
| '''Re-enabling requires reboot''' (or a [//forum.pine64.org/showthread.php?tid=8313&pid=52645#pid52645 command line hack to bind/unbind]).<br />
|-<br />
! scope=row | PINE64 logo key+F12<br />
| Camera privacy switch || CAPs lock and NUM lock LEDs blink together. 2 blinks = enabled, 3 blinks = disabled<br />
|}<br />
<br />
'''Press and hold the PINE64 logo key plus F10/F11/F12 for 3 seconds.'''<br />
<br />
The keyboard operates on firmware independent of the operating system. It detects when one of the F10, F11 or F12 keys is held in combination with the Pine key for 3 seconds, and disables power to the corresponding peripheral. This has the same effect as cutting off the power to each peripheral with a physical switch. This implementation is very secure: because the firmware that decides whether a peripheral gets power is not part of the Pinebook Pro’s operating system, the power state for each peripheral cannot be overridden or accessed from the operating system. The power state setting for each peripheral is stored across reboots inside the keyboard's firmware flash memory.<br />
<br />
== Trackpad ==<br />
The trackpad is a reasonable size, has a matte finish that your finger can slide along easily, and two actuating buttons. It is the only component of the Pinebook Pro held in place with strong adhesive tape. It supports multi-touch functionality. <br />
Documentation for the trackpad can be found in [[#Datasheets for Components and Peripherals|Datasheets for Components and Peripherals]].<br />
The trackpad firmware binary can be flashed from userspace using the provided open source utility (https://github.com/ayufan-rock64/pinebook-pro-keyboard-updater). A fork with more recent changes is also available (https://github.com/jackhumbert/pinebook-pro-keyboard-updater) and may be worth considering instead, though feature parity between the two differs.<br />
<br />
<br />
'''Everyone with a Pinebook Pro produced in 2019 should update their keyboard and trackpad firmware.''' <br />
<br />
Before you start:<br />
<br />
Please refer to the original documentation for details.<br />
<br />
Your Pinebook Pro should be either fully charged or, preferably, running off mains power. This utility writes to chips on the keyboard and trackpad, so a loss of power during any stage of the update can result in irrecoverable damage to your trackpad or keyboard.<br />
<br />
The scripts ought to work on all OSs available for the Pinebook Pro. Some OSs may, however, require installation of relevant dependencies. The instructions below assume a Debian desktop, newer Pinebook Pro models that come with Manjaro will require a different command to install the proper dependencies.<br />
<br />
There are two keyboard versions of the Pinebook Pro, ISO and ANSI. You need to know which model you have prior to running the updater. <br />
FW update steps for both models are listed below. <br />
<br />
What you will need:<br />
<br />
*Your Pinebook Pro fully charged or running off of mains power<br />
*Connection to WiFi<br />
*An external USB keyboard & mouse, or access to the Pinebook Pro via SSH<br />
<br />
'''ISO Model''' <br />
<br />
From the terminal command line: <br />
<br />
<pre><br />
git clone https://github.com/ayufan-rock64/pinebook-pro-keyboard-updater<br />
cd pinebook-pro-keyboard-updater<br />
sudo apt-get install build-essential libusb-1.0-0-dev xxd<br />
make<br />
</pre><br />
<br />
Step 1<br />
<pre><br />
cd pinebook-pro-keyboard-updater<br />
sudo ./updater step-1 iso<br />
sudo reboot<br />
</pre><br />
<br />
Step 2 (after reboot)<br />
<pre><br />
cd pinebook-pro-keyboard-updater<br />
sudo ./updater step-2 iso<br />
sudo reboot<br />
</pre><br />
<br />
----<br />
<br />
'''ANSI Model''' <br />
<br />
*<b>NOTE:</b> Running step-1 on the ANSI keyboard model will make the keyboard and trackpad inaccessible until step-2 is run, so an external keyboard must be connected to complete the update on this model!<br />
<br />
From the terminal command line: <br />
<br />
<pre><br />
git clone https://github.com/ayufan-rock64/pinebook-pro-keyboard-updater<br />
cd pinebook-pro-keyboard-updater<br />
sudo apt-get install build-essential libusb-1.0-0-dev xxd<br />
make<br />
</pre><br />
<br />
Step 1<br />
<pre><br />
cd pinebook-pro-keyboard-updater<br />
sudo ./updater step-1 ansi<br />
sudo reboot<br />
</pre><br />
<br />
Step 2 (after reboot)<br />
<pre><br />
cd pinebook-pro-keyboard-updater<br />
sudo ./updater step-2 ansi<br />
sudo reboot<br />
</pre><br />
When done, if some of the keys produce incorrect characters, please check your OS's language settings. For ANSI users, the default OS shipped with English UK as the default layout; you can change it to English US if desired.<br />
<br />
=== X-Windows & trackpad settings ===<br />
Some forum members have found that an adjustment to X-Windows allows finer motion on the trackpad. If you use the '''Synaptics''' mouse/trackpad driver, use this command to make the change live:<br />
<pre>synclient MinSpeed=0.25</pre><br />
You may experiment with different settings, but 0.25 was tested as helping noticeably.<br><br />
<br><br />
To make the change persist across reboots, change the file <code>/etc/X11/xorg.conf</code> similar to below;<br />
<pre> Section "InputClass"<br />
Identifier "touchpad catchall"<br />
Driver "synaptics"<br />
MatchIsTouchpad "on"<br />
MatchDevicePath "/dev/input/event*"<br />
Option "MinSpeed" "0.25"<br />
EndSection</pre><br />
The line <code>Option "MinSpeed" "0.25"</code> is the change.<br><br />
<br><br />
Another forum user built on the above settings a little and found these to work well:<br />
<pre>synclient MinSpeed=0.25<br />
synclient FingerLow=30<br />
synclient PalmDetect=1<br />
synclient VertScrollDelta=64<br />
synclient HorizScrollDelta=64</pre><br />
<br />
<code>FingerLow</code> is set here to the same value as <code>FingerHigh</code> (30). This is believed to help reduce pointer movement as you lift your finger, though it is unconfirmed whether the Synaptics driver actually works this way.<br />
You may find this config to be comfortable for daily use.<br />
<br><br />
<br />
The <code>right mouse click</code> is emulated by tapping with two fingers on the trackpad. If you feel that this is not very responsive you can try this value:<br />
<pre>synclient MaxTapTime=250</pre><br />
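To make these additional settings persist across reboots, they can also be expressed as Option lines in the same xorg.conf InputClass section shown earlier. A sketch for the Synaptics driver (the values are starting points from this page, not canonical defaults):<br />

```
Section "InputClass"
        Identifier "touchpad catchall"
        Driver "synaptics"
        MatchIsTouchpad "on"
        MatchDevicePath "/dev/input/event*"
        Option "MinSpeed" "0.25"
        Option "FingerLow" "30"
        Option "PalmDetect" "1"
        Option "VertScrollDelta" "64"
        Option "HorizScrollDelta" "64"
        Option "MaxTapTime" "250"
EndSection
```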
<br><br />
<br />
== Power Supply ==<br />
* Input Power: 5V DC @ 3A<br />
* Mechanical: 3.5mm OD / 1.35mm ID, Barrel jack<br />
* USB-C 5V, 15W PD quickcharge<br />
* Only use one power input at a time, barrel jack OR USB-C<br />
<br />
== LEDs ==<br />
In total there are four LEDs on the Pinebook Pro, three of which are placed in the top left side of the keyboard, and one near the barrel-port: <br />
<br />
:1. The red LED next to the barrel-port indicates charging. It will illuminate when mains power is supplied to the Pinebook Pro from either the standard power supply unit or a USB-C smartphone charger.<br />
<br />
:2. The power indicator LED on the Pinebook Pro supports three different colours: green, amber and red. It is also capable of flashing/blinking to indicate activity. In the default Debian with MATE build, green LED means power and red means suspend (amber is unused). <br />
<br />
:3. The Num lock, green LED.<br />
<br />
:4. The Caps lock, green LED.<br />
<br />
(The Num and Caps lock LEDs have a secondary function. When the privacy switches get activated they blink to confirm that switch has been activated.)<br />
<br />
== Webcam ==<br />
* Streaming video resolutions supported, (un-compressed):<br />
** 320 x 240 <br />
** 640 x 480<br />
** 800 x 600<br />
** 1280 x 720<br />
** 1600 x 1200<br />
* Still frame resolutions supported:<br />
** 160 x 120<br />
** 176 x 144<br />
** 320 x 240<br />
** 352 x 288 <br />
** 640 x 480<br />
** 800 x 600<br />
** 1280 x 720<br />
** 1600 x 1200<br />
* The webcam can be tested with an application such as Cheese<br />
WIP<br />
<br />
== Microphones ==<br />
While it has been said that some Pinebook Pro units contain only one microphone despite having two labeled microphone holes on the outer casing, other units do indeed contain two microphones. It is presently unclear which batches have either configuration; units from the initial community batch of 1000 units (following the initial 100) are believed to contain two, populating both labeled holes.<br />
<br />
The wires leading to both microphones connect to the mainboard with a small white plastic connector, located directly adjacent to the ribbon cable attachment point for the keyboard interface.<br />
<br />
<br />
<br />
'''Microphones not working?'''<br />
<br />
If pavucontrol shows no microphone input activity, first try the [[Pinebook_Pro#Privacy_Switches]]; once the microphone is switched on, follow the steps below. If that still has not fixed it, you may want to check that the microphone connector is plugged in (see the [[Pinebook_Pro#Technical_Reference]]).<br />
<br />
* Run <code>alsamixer</code> from the command line<br />
* Press F6 and select the es8316 sound card<br />
* Press F4 to switch to the capture screen<br />
* Select the bar labeled ADC and increase the gain to 0 dB<br />
* In pavucontrol, change the audio profile to another one with input<br />
Additionally, you may want to adjust the ADC PGA control to get the levels where you want them.<br />
<br />
== Bluetooth and WiFi ==<br />
[[File:PinebookPro_WirelessIC_Location.jpg|400px|thumb|right|The Pinebook Pro's AP6256 wireless module]]<br />
===Hardware Overview===<br />
The Pinebook Pro contains an AMPAK AP6256 wireless module to provide Wi-Fi (compliant to IEEE 802.11ac) and Bluetooth (compliant to Bluetooth SIG revision 5.0). The module contains a Broadcom transceiver IC, believed to be the BCM43456, as well as the support electronics needed to allow the Wi-Fi and Bluetooth modes to share a single antenna. <br />
<br />
The wireless module interfaces with the Pinebook Pro’s system-on-chip using a combination of three interfaces: Bluetooth functionality is operated by serial UART and PCM, while the Wi-Fi component uses SDIO. It is unknown if the module’s Bluetooth capabilities are usable under operating systems that do not support SDIO.<br />
<br />
The module’s RF antenna pin is exposed on the mainboard via a standard Hirose U.FL connector, where a coaxial feedline links it to a flexible adhesive antenna situated near the upper right corner of the Pinebook Pro’s battery. As the RF connector is fragile and easily damaged, it should be handled carefully during connection and disconnection, and should not be reconnected frequently.<br />
<br />
===Issues===<br />
Problems have been reported with the Wi-Fi transceiver’s reliability during extended periods of high throughput, especially on the 2.4 GHz band. While the cause of this has yet to be determined, switching to the 5 GHz band may improve stability.<br />
<br />
Since the Bluetooth transceiver shares both its spectrum and antenna with 2.4 GHz Wi-Fi, simultaneous use of these modes may cause interference, especially when listening to audio over Bluetooth. If Bluetooth audio cuts out frequently, switching to the 5 GHz band – or deactivating Wi-Fi – may help.<br />
<br />
===Wi-Fi Capabilities===<br />
Wi-Fi on the Pinebook Pro is capable of reaching a maximum data transfer rate of approximately 433 megabits per second, using one spatial stream. The transceiver does not support multiple spatial streams or 160-MHz channel bandwidths.<br />
<br />
The Wi-Fi transceiver supports the lower thirteen standard channels on the 2.4 GHz band, using a bandwidth of 20 MHz. At least twenty-four channels are supported on the 5 GHz band, spanning frequencies from 5180 to 5320 MHz, 5500 to 5720 MHz, and 5745 to 5825 MHz, with bandwidths of 20, 40, or 80 MHz.<br />
<br />
Maximum reception sensitivity for both bands is approximately -92 dBm. The receiver can tolerate input intensities of no more than -20 dBm on the 2.4 GHz band, and no more than -30 dBm on the 5 GHz band. Maximum transmission power is approximately +15 dBm for either band, falling further to approximately +10 dBm at higher data transfer rates on the 5 GHz band.<br />
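These dBm figures can be related to linear power with P(mW) = 10^(dBm/10). A small shell helper (a sketch using awk for the floating-point arithmetic) shows the scale involved:<br />

```shell
# Convert a power level in dBm to milliwatts: P(mW) = 10^(dBm/10)
dbm_to_mw() {
    awk -v d="$1" 'BEGIN { printf "%.3g\n", 10 ^ (d / 10) }'
}

dbm_to_mw 15     # +15 dBm maximum transmit power is roughly 31.6 mW
dbm_to_mw -92    # -92 dBm reception sensitivity is well under a nanowatt
```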
<br />
With current available drivers and firmware, the Wi-Fi interface supports infrastructure, ad-hoc, and access-point modes with satisfactory reliability. Monitor mode is not presently supported. Wi-Fi Direct features may be available, but it is unclear how to make use of them under Linux.<br />
<br />
Be aware that Linux userspace utilities, such as <code>iw</code>, may report inaccurate information about the capabilities of wireless devices. Parameter values derived from vendor datasheets, or direct testing, should be preferred to the outputs of hardware-querying tools.<br />
<br />
===Bluetooth Capabilities===<br />
Bluetooth data transfer speeds have an indicated maximum of 3 megabits per second, but it is unclear what practical data rates can be expected. Audio streaming over Bluetooth is functioning normally, as is networking. Bluetooth Low-Energy functions, such as interacting with Bluetooth beacons, have not yet been tested conclusively.<br />
<br />
The Bluetooth transceiver supports all 79 channel allocations, spanning frequencies from 2402 MHz to 2480 MHz. Reception sensitivity is approximately -85 dBm, with a maximum tolerable reception intensity of -20 dBm. Bluetooth transmission power is limited to +10 dBm.<br />
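The channel numbering follows the standard Bluetooth BR/EDR mapping: channels are spaced 1 MHz apart starting at 2402 MHz, so channel k (0 to 78) sits at 2402 + k MHz. A one-line shell helper illustrates the mapping:<br />

```shell
# Bluetooth BR/EDR channel k (0 to 78) sits at 2402 + k MHz
bt_channel_freq() {
    echo $((2402 + $1))
}

bt_channel_freq 0    # first channel, 2402 MHz
bt_channel_freq 78   # last channel, 2480 MHz
```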
<br />
===Disabling Bluetooth===<br />
<pre><br />
# disable bluetooth until next boot<br />
sudo rfkill block bluetooth<br />
<br />
# confirm<br />
rfkill list bluetooth<br />
</pre><br />
<br />
<pre><br />
#disable bluetooth on boot**<br />
sudo systemctl enable rfkill-block@bluetooth<br />
</pre><br />
<br />
<nowiki>**This does not do what one might want on certain distros, Manjaro XFCE for example. Try the below.</nowiki><br />
<br />
<pre><br />
right click on the bluetooth panel icon > select 'plugins' > PowerManager > configuration > deselect the auto power on option<br />
</pre><br />
<br />
== LCD Panel ==<br />
* Model: BOE NV140FHM-N49<br />
* 14.0" (35.56cm) diagonal size<br />
* 1920x1080 resolution<br />
* 60hz refresh rate<br />
* IPS<br />
* 1000:1 contrast<br />
* 250nt brightness<br />
* 63% sRGB coverage<br />
* 6 bit colour<br />
* 30 pin eDP connection<br />
<br />
Some people have tested hardware video decode using the following;<br />
<pre>ffmpeg -benchmark -c:v h264_rkmpp -i file.mp4 -f null -</pre><br />
<br />
== External ports list ==<br />
Here are a list of the external ports. See [[Pinebook_Pro#Expansion_Ports|Technical Reference - Expansion Ports]] for port specifications.<br />
* Left side<br />
** Barrel jack for power, (with LED)<br />
** USB 3, Type A<br />
** USB 3, Type C<br />
* Right side<br />
** USB 2, Type A<br />
** Standard headset jack<br />
** MicroSD card slot<br />
<br />
== Using the UART ==<br />
[[File:PBPUART.jpeg|400px|thumb|right|Headphone jack UART wiring reference.<br />
<br> Swapping the tx and rx around from this also works and is more traditional.<br />
<br> See [http://files.pine64.org/doc/pinebook/guide/Pinebook_Earphone_Serial_Console_Developer_Guide.pdf this] official Pine64 .pdf.]]<br />
<br />
UART output is enabled by flipping the UART switch to the ON position (item 9). To do so you need to remove the Pinebook Pro's bottom cover - please follow [https://wiki.pine64.org/index.php/Pinebook_Pro_Main_Page#Accessing_the_Internals_-_Disassembly_and_Reassembly proper disassembly and reassembly protocol]. The OFF position is towards the touchpad, the ON position is towards the display hinges.<br />
<br />
With the UART switch in the ON position, console is relayed via the audiojack and the laptop's sound is turned OFF. Please ensure that you are using a 3.3v interface (such as the CH340, FTDI-232R, or PL2303, which are sold in both 3.3v and 5v variants) to avoid damage to the CPU. <br />
<br />
Insert the USB plug of the cable into an open USB port on the machine that will monitor the console. Run the following in a terminal:<br />
<br />
<code><br />
$ lsusb<br />
</code><br />
<br />
you should find a line similar to this:<br />
<br />
<code><br />
Bus 001 Device 058: ID 1a86:7523 QinHeng Electronics HL-340 USB-Serial adapter<br />
</code><br />
<br />
You may have to clean the USB contacts of the Serial cable to get a good connection if you do not find that line.<br />
<br />
The audio jack of the Serial cable should be fully inserted into the Pinebook Pro audio port.<br />
<br />
Serial output should now be accessible using screen, picocom or minicom (and others).<br />
Examples:<br />
<br />
<code><br />
screen /dev/ttyUSB0 1500000<br />
<br />
picocom /dev/ttyUSB0 -b 1500000<br />
<br />
minicom -D /dev/ttyUSB0 -b 1500000</code><br />
<br />
Old versions of U-Boot do not use the UART for console output. <strike>The console function is activated by the Linux kernel. Thus, if you use a non-Pinebook Pro Linux distro and want the UART as a console, you have to manually enable it.</strike><br />
<br />
== Using the optional NVMe adapter ==<br />
The optional NVMe adapter allows the use of M.2 cards that support the NVMe standard (but not the SATA standard). The optional NVMe M.2 adapter supports '''M''' and '''M'''+'''B''' keyed devices, in both 2242 and 2280 physical sizes, the most common ones available. In addition, 2230 and 2260 are also supported, though NVMe devices in those sizes are rare.<br />
<br />
Once you have fitted and tested your NVMe drive, please add a note to this page [[Pinebook_Pro_Hardware_Accessory_Compatibility]] to help build a list of tried and tested devices.<br />
<br />
=== Installing the adapter ===<br />
The V2.1-2019-0809 SSD adapter that shipped with the initial Pinebook Pro batches had significant issues. A repair kit will be shipped to address those issues.<br />
(If necessary, it can be modified to work. There is [https://forum.pine64.org/showthread.php?tid=8322&pid=52700#pid52700 an unofficial tutorial on the forums] describing these modifications.)<br />
<br />
The updated SSD adapter, labeled V2-2019-1107, takes into account the prior problems with trackpad interference. New orders as of Feb. 22nd, 2020 ship with the updated adapter.<br />
<br />
Actual installation instructions are a work in progress.<br />
<br />
=== Post NVMe install power limiting ===<br />
Some NVMe SSDs allow reducing their maximum power draw. Doing so may reduce speed, but on the Pinebook Pro it can both improve reliability at lower battery levels and reduce overall power use, helping battery life.<br />
Here are the commands to obtain and change the power settings; the package 'nvme-cli' is required to run them. The example shows how to find the available power states and then sets the lowest non-standby setting (which is 3.8 watts for the device shown):<br />
<pre>$ sudo nvme id-ctrl /dev/nvme0<br />
NVME Identify Controller:<br />
...<br />
ps 0 : mp:9.00W operational enlat:0 exlat:0 rrt:0 rrl:0<br />
rwt:0 rwl:0 idle_power:- active_power:-<br />
ps 1 : mp:4.60W operational enlat:0 exlat:0 rrt:1 rrl:1<br />
rwt:1 rwl:1 idle_power:- active_power:-<br />
ps 2 : mp:3.80W operational enlat:0 exlat:0 rrt:2 rrl:2<br />
rwt:2 rwl:2 idle_power:- active_power:-<br />
ps 3 : mp:0.0450W non-operational enlat:2000 exlat:2000 rrt:3 rrl:3<br />
rwt:3 rwl:3 idle_power:- active_power:-<br />
ps 4 : mp:0.0040W non-operational enlat:6000 exlat:8000 rrt:4 rrl:4<br />
rwt:4 rwl:4 idle_power:- active_power:-<br />
<br />
$ sudo nvme get-feature /dev/nvme0 -f 2<br />
get-feature:0x2 (Power Management), Current value:00000000<br />
$ sudo nvme set-feature /dev/nvme0 -f 2 -v 2 -s<br />
set-feature:02 (Power Management), value:0x000002</pre><br />
Some NVMe SSDs don't appear to allow saving the setting with the "-s" option. In those cases, leave off the "-s" and use a startup script to set the non-default power state at boot.<br><br />
If you want to test performance without saving the new power setting semi-permanently, leave off the "-s" option.<br/><br />
<br/><br />
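One way to implement such a startup script is a systemd oneshot unit. The following is a minimal sketch, assuming the drive is <code>/dev/nvme0</code>, power state 2 is wanted, and the nvme binary is installed at /usr/sbin/nvme; the unit name and paths are illustrative only:<br />

```ini
# /etc/systemd/system/nvme-power-state.service  (hypothetical unit name)
[Unit]
Description=Set a reduced NVMe power state at boot

[Service]
Type=oneshot
# Same set-feature call as above, minus the unsupported -s save flag
ExecStart=/usr/sbin/nvme set-feature /dev/nvme0 -f 2 -v 2

[Install]
WantedBy=multi-user.target
```

Enable it with <code>sudo systemctl enable nvme-power-state.service</code>.<br />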
NVMe SSDs have another power saving feature, APST (Autonomous Power State Transitions), which performs power-state transitions automatically based on usage. To check whether your NVMe SSD supports this feature:<br />
<pre>$ sudo nvme get-feature -f 0x0c -H /dev/nvme0</pre><br />
Information for this feature, (on a Pinebook Pro), is a work in progress.<br />
<br />
=== Using as data drive ===<br />
As long as the kernel in use has both the PCIe and NVMe drivers, you should be able to use a NVMe drive as a data drive. It can automatically mount when booting from either the eMMC or an SD card. This applies to Linux, FreeBSD, and Chromium, using the normal partitioning and file system creation tools. Android requires testing.<br />
<br />
=== Using as OS root drive ===<br />
The SoC does not include the NVMe boot code, so the NVMe is not in the SoC's boot order. However, using the [https://github.com/mrfixit2001/updates_repo/blob/v1.1/pinebook/filesystem/mrfixit_update.sh U-Boot update script] from the mrfixit2001 Debian or [https://pastebin.com/raw/EeK074XB Arglebargle's modified script], and [https://github.com/pcm720/rockchip-u-boot/releases the modified u-boot images] provided by forum user pcm720, you can now add support to boot from an NVMe drive. Binary images are usable with SD, eMMC, and [[Pinebook_Pro_SPI|SPI flash]]. For OS images using the mainline kernel, there are a few variants of U-Boot available that also support NVMe as the OS drive, though these may require writing U-Boot to the SPI flash for proper use of the NVMe as the OS drive.<br />
<br />
The current boot order, per last testing, for this modified U-Boot is:<br />
*MicroSD<br />
*eMMC<br />
*NVMe<br />
<br />
For more information, please refer to [https://forum.pine64.org/showthread.php?tid=8439&pid=53764#pid53764 the forum post.]<br />
<br />
It is also possible to initially boot off an eMMC or SD card, then transfer to a root file system on the NVMe. Currently, it is necessary to have the U-Boot code on an eMMC or SD card. (A forum member [https://forum.pine64.org/showthread.php?tid=8439 posted here] about using a modified version of U-Boot with NVMe drivers, that uses <code>/boot</code> and <code>/</code> off the NVMe drive. So this may change in the future.)<br />
<br />
Please see [[Pinebook_Pro#Bootable Storage|Bootable Storage]].<br />
<br />
== Caring for the Pinebook Pro ==<br />
=== Bypass Cables ===<br />
The mainboard features two (disconnected by default) bypass cables that are only to be used with the battery disconnected. The female (10) and male (6) ends of the bypass cables can be connected to provide power to the mainboard if you need to run the laptop without a battery. Please refer to this [http://files.pine64.org/doc/PinebookPro/PinebookPro_Engineering_Notice.pdf engineering notice]. <br />
<br />
'''Note that despite the bypass cable being a two conductor cable, it is only used as one. Both wires being soldered together on either side is normal!'''<br />
<br />
WARNING: Do not connect the bypass cables with the battery connected. Using the bypass cables with the battery connected can permanently damage the computer.<br />
<br />
=== [[Pinebook_Service_Step_by_Step_Guides|Pinebook Service Step-by-Step Guides]] ===<br />
<span style="color:#FF0000">Placeholder for Pinebook Pro specific guides</span><br />
<br />
Under [[Pinebook_Service_Step_by_Step_Guides|'Service Guides for Pinebook']] you can find instruction guides covering disassembly of:<br />
<br />
'''Note: The installation process on the Pinebook Pro is similar to the 14″ Pinebook'''<br />
<br />
'''Note: The installation process is the reverse of the removal guide'''<br />
<br />
* 14″ Pinebook Lithium Battery Pack Removal Guide<br />
* 14″ Pinebook LCD Panel Screen Removal Guide<br />
* 14″ Pinebook eMMC Module Removal Guide<br />
<br />
== Using the SPI flash device ==<br />
The Pinebook Pro comes with a 128 Mbit (16 MByte) SPI flash device intended as the initial boot target, used to store the bootloader. The SoC used on the Pinebook Pro boots from this SPI flash device first, before the eMMC or SD card. At present (April 19, 2020) Pinebook Pros ship without anything programmed in the SPI flash device, so the SoC moves on to the next potential boot device, the eMMC. ARM/ARM64 computers do not have a standardized BIOS, yet.<br />
<br />
Here is some information on using the SPI flash device:<br />
<br />
* You need the kernel built with SPI flash device support, which will supply a device similar to:<br><br><code>/dev/mtd0</code><br><br><br />
* The Linux package below will need to be available:<br><br><code>mtd-utils</code><br><br><br />
* You can then use this program from the package to write the SPI device:<br><br><code>flashcp &lt;filename&gt; /dev/mtd0</code><br><br><br />
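Since the SPI device holds only 16 MByte, it can be worth sanity-checking the image size before writing it. A minimal shell sketch (the helper name is made up, and the file name in the usage comment is a placeholder):<br />

```shell
# Return success only if a bootloader image fits in the
# 16 MByte (128 Mbit) SPI flash device
spi_image_fits() {
    flash_bytes=$((16 * 1024 * 1024))
    img_bytes=$(wc -c < "$1") || return 2
    [ "$img_bytes" -le "$flash_bytes" ]
}

# Hypothetical usage ("bootloader.img" is a placeholder file name):
# spi_image_fits bootloader.img && flashcp bootloader.img /dev/mtd0
```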
<br />
If you need to recover from a defective bootloader written to the SPI flash, you can simply short pin 6 of the SPI flash to GND and boot. This renders the SoC bootrom unable to read from the SPI flash, making it fall back to reading the bootloader from other boot media such as the eMMC or Micro SD card.<br />
<br />
The procedures described above are a lot less risky than attaching an external SPI flasher and do not require any additional hardware.<br><br />
<br><br />
At present (April 19th, 2020) there is no good bootloader image to flash into the SPI flash device. This is expected to change, as there are people working on the issue.<br />
<br />
== FAQ ==<br />
What cool software works out of the box? [[Pinebook Pro OTB Experience]]<br />
<br />
= Software tuning guide =<br />
Details on how to get the most out of a Pinebook Pro & its RK3399 SoC.<br />
<br />
== Customizing the Pinebook Pro's default Manjaro KDE system ==<br />
=== Watching DRM content (Netflix, etc.) ===<br />
Most paid online streaming services use Widevine DRM to make their content more difficult to pirate. Widevine is not directly supported on Manjaro KDE, however it is still possible to watch DRM content via the "chromium-docker" package which downloads a 32-bit ARM container and installs Chromium with Widevine inside of that. While not space-efficient, or efficient in general, it's the recommended solution for watching this content on your Pinebook Pro. You can install this package with:<br />
<pre>sudo pacman -Sy chromium-docker</pre><br />
=== Checking GPU capabilities ===<br />
To see what versions of OpenGL and OpenGL ES are supported by the Pinebook Pro, what driver is in use, and what version of the driver is loaded, install the "mesa-demos" package with:<br />
<pre>sudo pacman -Sy mesa-demos</pre><br />
And then run:<br />
<pre>glxinfo | grep OpenGL</pre><br />
This will give detailed information about your graphics card and driver, useful for debugging.<br />
<br />
=== Better GPU compatibility and performance ===<br />
For better graphics performance, you may install the "mesa-git" package, built and supplied in the Manjaro ARM repos. This lets you bring in the latest features, optimizations, and bugfixes for the graphics driver used by the Pinebook Pro. Installation is as simple as:<br />
<pre>pacman -Sy mesa-git</pre><br />
Then you may reboot to load the newer driver.<br />
<br />
=== OpenGL ES 3.0 support ===<br />
By default, with the current state of the Panfrost GPU driver, the Pinebook Pro supports OpenGL 2.1 and OpenGL ES 2.0. If you want to use OpenGL ES 3.0, you need to set a system-wide environment variable. Open the '''/etc/environment''' file with:<br />
<pre>kate /etc/environment</pre><br />
And then at the bottom of the file, on a new line, add:<br />
<pre>PAN_MESA_DEBUG="gles3"</pre><br />
Then save the file, entering your password when prompted, and reboot the system. When you check your GPU capabilities, it should report OpenGL ES 3.0 and applications that rely on it should function properly. Note that GLES3 support is currently incomplete and some advanced rendering features may still not work properly.<br />
== Customizing the Pinebook Pro's previously-default Debian system ==<br />
Here are some hints on what you can do to customize the Pinebook Pro's previous factory image (aka [https://github.com/mrfixit2001/debian_desktop mrfixit2001 debian build])<br />
<br />
=== Initial user changes, password, name, etc ===<br />
When you first get your Pinebook Pro, you should consider setting strong passwords and making the default account your own.<br />
<br />
* Reboot (this is just to ensure all background processes belonging to the user are not running... there are other ways to achieve this, but this way is easy)<br />
* Once the machine reboots press Alt-Ctrl-F1 to bring up a text terminal<br />
* Login as root (login: root, password: root)<br />
* Set a strong password for the root user using the following command and its prompts:<br />
<pre># passwd (and follow prompts)</pre><br />
* Rename the rock user to your preferred username (replace myself with whatever you like):<br />
<pre># usermod -l myself -d /home/myself -m rock</pre><br />
* Rename the rock group to match your preferred username:<br />
<pre># groupmod -n myself rock</pre><br />
* Put your name in the account, (replace "John A Doe" with your name):<br />
<pre># chfn -f "John A Doe" myself</pre><br />
* Set a strong password for the normal user:<br />
<pre># passwd myself</pre><br />
* Log out of the text terminal:<br />
<pre># logout</pre><br />
* Press Alt-Ctrl-F7 to go back to the login screen and then login as the normal user<br />
* Open a text terminal to fix the login error "Configured directory for incoming files does not exist":<br />
<pre>$ blueman-services</pre><br />
Select the "Transfer" tab and set "Incoming Folder" to your home directory.<br />
<br />
Alternatively, if <code>adduser</code> is available in your distro, it is much easier to create a fresh account:<br />
<pre># adduser myself</pre><br />
Fill out the requested data, then add the new user to each group the default user belongs to, one group at a time:<br />
<pre># adduser myself GROUPNAME</pre><br />
To see which groups to add, compare the output of <code>id myself</code> and <code>id rock</code>.<br />
<br />
=== Changing the default hostname ===<br />
Debian 9 has a command that allows you to change the hostname. You can see the current settings using:<br />
<pre>> sudo hostnamectl<br />
Static hostname: Debian-Desktop<br />
Icon name: computer<br />
Machine ID: dccbddccbdccbdccbdccbdccbdccbccb<br />
Boot ID: ea99ea99ea99ea99ea99ea99ea99ea99<br />
Operating System: Debian GNU/Linux 9 (stretch)<br />
Kernel: Linux 4.4.210<br />
Architecture: arm64</pre><br />
To change it, use this (with "My_Hostname" as the example):<br />
<pre>> sudo hostnamectl set-hostname My_Hostname</pre><br />
When done, you can re-verify using the first example.<br />
<br />
Then you should back up and edit the hostname entry in <code>/etc/hosts</code>:<br />
<pre>> sudo cp -p /etc/hosts /etc/hosts.`date +%Y%m%d`<br />
> sudo vi /etc/hosts<br />
127.0.0.1 localhost<br />
127.0.0.1 My_Hostname<br />
::1 localhost ip6-localhost ip6-loopback<br />
fe00::0 ip6-localnet<br />
ff00::0 ip6-mcastprefix<br />
ff02::1 ip6-allnodes<br />
ff02::2 ip6-allrouters<br />
127.0.1.1 linaro-alip</pre><br />
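The same edit can also be scripted with sed. A hedged sketch (the helper name is made up here; on the factory image the old entry is <code>linaro-alip</code>, but verify what yours actually contains first):<br />

```shell
# Replace the old hostname with the new one in a hosts file,
# keeping a .bak backup of the original (helper name is made up)
update_hosts_name() {
    sed -i.bak "s/\b$2\b/$3/g" "$1"
}

# On the real file this would need sudo:
# update_hosts_name /etc/hosts linaro-alip My_Hostname
```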
<br />
=== Disable Chromium browser's prompt for passphrase & password storage ===<br />
<br />
Perform the following steps:<br />
<br />
* On the tool bar, hover over the Chromium icon<br />
* Using the right mouse button, select '''Properties'''<br />
* In the '''Command:''' line section, add <code>--password-store=basic</code> before the <code>%U</code><br />
* Use the '''x Close''' button to save the change<br />
This will, of course, use basic password storage, meaning any saved passwords are not encrypted. That is perfectly fine if you never use password storage.<br />
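If you prefer the command line, the equivalent change can be made to the launcher's <code>Exec=</code> line in its .desktop file. A sketch; the helper name is made up and the file path varies by install:<br />

```shell
# Insert --password-store=basic before %U on the Exec= line
# of a .desktop launcher file (helper name is made up)
add_basic_password_store() {
    sed -i 's/^\(Exec=.*\)%U/\1--password-store=basic %U/' "$1"
}

# Hypothetical path; the actual launcher location varies by install:
# add_basic_password_store ~/.local/share/applications/chromium.desktop
```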
<br />
=== Changing the boot splash picture ===<br />
<br />
The default boot splash picture can be replaced using the following instructions:<br />
<br />
* Install '''ImageMagick''' which will do the conversion<br />
<pre>$ sudo apt-get install imagemagick</pre><br />
* Create a 1920 x 1080 picture. For the best results, use a PNG image (It supports lossless compression).<br />
* From the directory in which your new image is stored run the following commands<br />
* Convert your image to the bootsplash raw format using imagemagick convert.<br />
<pre>$ convert yoursplashimage.png -separate +channel -swap 0,2 -combine -colorspace sRGB RGBO:splash.fb</pre><br />
* Create a backup copy of your current splash screen<br />
<pre>$ sudo cp /usr/share/backgrounds/splash.fb /usr/share/backgrounds/splash_original.fb</pre><br />
* Copy your new splash screen into place<br />
<pre>$ sudo cp splash.fb /usr/share/backgrounds/splash.fb</pre><br />
* Set the correct permissions on the splash.fb file<br />
<pre>$ sudo chmod 644 /usr/share/backgrounds/splash.fb</pre><br />
* If you do not want to see kernel console text messages, make sure you don't have '''Plymouth''' installed<br />
<br />
=== Watching Amazon Prime videos with Chromium ===<br />
When you create a new user, it will be necessary to launch the Chromium browser with a specific user agent, like so:<br />
<pre>chromium-browser --user-agent="Mozilla/5.0 (X11; CrOS armv7l 6946.63.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36"</pre><br />
There may be more tweaks needed.<br />
<br />
=== Enabling text boot time messages ===<br />
<br />
By default, most Linux distros have a boot screen with a picture. To see all the boot time messages, use one of the following;<br />
<br />
<b><u>Default Debian</u></b><br />
* Backup and edit the U-Boot configuration file:<br />
<pre>cp -p /etc/default/u-boot /etc/default/u-boot.`date +%Y%m%d`<br />
chmod a-w /etc/default/u-boot.`date +%Y%m%d`<br />
vi /etc/default/u-boot</pre><br />
Remove the '''quiet''' and '''splash''' parameters. Leave everything else alone.<br />
* Update the U-Boot configuration:<br />
<pre>u-boot-update</pre><br />
* Test and verify you get what you think you should be seeing.<br />
<br><br />
<b><u>Manjaro</u></b><br />
* Backup and edit the U-Boot configuration file:<br />
<pre>cp -p /boot/extlinux/extlinux.conf /boot/extlinux/extlinux.conf.`date +%Y%m%d`<br />
chmod a-w /boot/extlinux/extlinux.conf.`date +%Y%m%d`<br />
vi /boot/extlinux/extlinux.conf</pre><br />
Remove the '''bootsplash.bootfile''' option and its parameter. Leave everything else alone.<br><br />
You can add verbose logging by appending '''ignore_loglevel''' to the line where boot splash was.<br />
* Test and verify you get what you think you should be seeing.<br />
<br />
== Improving readability ==<br />
<br />
Some people find that a 14" LCD screen with 1080p, (1920 x 1080), has text and icons a bit too small. There are things you can do to make the screen easier to use & read.<br><br />
* Increase the font size<br />
* Use a font with more pronounced features<br />
* Increase the various window manager sizes, (like increase the height of the tool bar)<br />
* Change the color scheme to be easier on the eyes, (even though it may not be bigger, proper contrast can help usability)<br />
* Change the window manager's decorations, (like using larger icons)<br />
* Use a workspace manager, (so one application per workspace)<br />
* When at home or office, use an external monitor<br />
* Changing the X Window System's DPI (dots per inch) can help too<br />
<br><br />
However, do not change the resolution of the LCD screen, otherwise you may end up with a blank / black screen. If that happens, see this troubleshooting section for the fix:<br><br />
[[Pinebook_Pro#After_changing_builtin_LCD_resolution.2C_blank_screen|Blank screen after changing builtin LCD resolution]]<br />
<br />
== Chromium tweaks ==<br />
<br />
=== Flags ===<br />
<br />
From the [https://github.com/mrfixit2001/updates_repo/blob/v1.8/pinebook/filesystem/default official Debian image]:<br />
<br />
<pre><br />
--disable-low-res-tiling \<br />
--num-raster-threads=6 \<br />
--profiler-timing=0 \<br />
--disable-composited-antialiasing \<br />
--test-type \<br />
--show-component-extension-options \<br />
--ignore-gpu-blacklist \<br />
--use-gl=egl \<br />
--ppapi-flash-path=/usr/lib/chromium-browser/pepper/libpepflashplayer.so \<br />
--ppapi-flash-version=32.0.0.255 \<br />
--enable-pinch \<br />
--flag-switches-begin \<br />
--enable-gpu-rasterization \<br />
--enable-oop-rasterization \<br />
--flag-switches-end<br />
</pre><br />
<br />
Note that in some cases, this may also decrease performance substantially, as observed when using these flags on the Manjaro KDE desktop. Feel free to experiment to find what is smoothest for you personally.<br />
<br />
== GVIM has performance issue ==<br />
It appears that using GTK3 can cause very slow scrolling, while VIM in a terminal window works fine.<br/><br />
Simply revert to using GTK2 (how to do so is somewhat Linux distro specific).<br />
<br />
== Kernel options ==<br />
Here are some Linux kernel options specific to the Pinebook Pro and its RK3399 SoC. If an option is kernel version (or version range) specific, that information is listed in its description.<br />
<br />
To see if a specific feature is enabled in the current kernel, you can use something like this;<br />
<br />
<pre><br />
$ zgrep -i rockchip_pcie /proc/config.gz<br />
# CONFIG_ROCKCHIP_PCIE_DMA_OBJ is not set<br />
CONFIG_PHY_ROCKCHIP_PCIE=m<br />
</pre><br />
If it's listed as <code>=m</code>, then it's a module. You can see if the module is loaded with;<br />
<pre><br />
$ lsmod | grep -i rockchip_pcie<br />
phy_rockchip_pcie 16384 0<br />
</pre><br />
Note that modules are not loaded until needed. Thus, we sometimes check the kernel configuration first to see if a feature is configured, then check whether it is loaded as a module.<br />
<br />
=== Hardware video decoding ===<br />
Here is a method to check for hardware video decoding by the VPU. There are special Linux kernel modules that perform this function.<br><br />
Older systems, such as the previously-default Debian desktop, use the Rockchip-supplied kernel module <code>rk-vcodec</code>. To check, something like this can be used:<br />
<pre><br />
$ lsmod | grep rk-vcodec<br />
or<br />
$ zgrep RK_VCODEC /proc/config.gz<br />
CONFIG_RK_VCODEC=y<br />
</pre><br />
Note that in the above example, the Rockchip video CODEC is not built as a module, but included in the kernel. Thus, it does not show up in the loaded-modules check.<br><br />
<br><br />
<br />
Newer systems may use a different option as in the configuration below:<br />
<pre><br />
$ zgrep HANTRO /proc/config.gz<br />
CONFIG_VIDEO_HANTRO=m<br />
CONFIG_VIDEO_HANTRO_ROCKCHIP=y<br />
</pre><br />
&nbsp;<br><br />
<br />
= Troubleshooting guide =<br />
Tips, tricks and other information for troubleshooting your Pinebook Pro<br />
=== New from the factory - Pinebook Pro won't boot / power on ===<br />
* Some Pinebook Pros came from the factory with the eMMC switch in the disabled position. It should be switched towards the back / hinge to enable the eMMC.<br><br />
* The eMMC may have come loose during shipment. [https://wiki.pine64.org/index.php/Pinebook_Pro_Main_Page#Accessing_the_Internals_-_Disassembly_and_Reassembly Open] the back and verify that the eMMC is firmly seated.<br><br />
* You may want to try unplugging the SD card daughterboard ribbon cable and see if it powers on (remove the battery and peel off a bit of the tape before unplugging it to avoid damage). If it does, try reseating it on both sides. It might have come loose during shipping.<br />
* It's possible that your eMMC is empty from the factory. Simply create a bootable SD card and see if your Pinebook Pro boots. If so, you can then write an OS image to the eMMC.<br />
<br />
=== Pinebook Pro will not power on after toggling the eMMC enable/disable switch ===<br />
* This may happen if you meant to toggle the UART/Headphone switch (9) towards touchpad for headphone use and instead you toggled the eMMC enable/disable switch (24).<br />
* After re-enabling the eMMC by toggling switch (24) towards the hinge, if the Pinebook Pro does not turn on, press the RESET button (28). It is clearly marked 'reset' on the PCB.<br />
<br />
=== Pinebook Pro will not power on after removing and replacing EMI shielding ===<br />
* Closely inspect that the shielding is firmly seated in the clips on all sides. It can be seated in the clips on one axis, and still have missed on another axis.<br />
<br />
=== Pinebook Pro won't boot when using UART console cable ===<br />
* If you're using the UART cable sold on the Pine Store, you may want to see if it boots after you disconnect it. Some users report that custom-made cables based on FTDI UART adapters do not cause this issue.<br />
* Make sure your USB to serial UART device is 3.3V. Many are 5V and some even ±12V. Pinebook Pros only support 3.3V and may act erratically when using a higher voltage. Further, a higher voltage could permanently damage the Pinebook Pro's SoC.<br />
<br />
=== WiFi issues ===<br />
* First, check the privacy switches to make sure your WiFi is enabled. They are persistent. See [[Pinebook_Pro_Main_Page#Privacy_Switches|Privacy Switches]]<br />
* Next, you may have to modify the <code>/etc/NetworkManager/NetworkManager.conf</code> as root user, and replace <code>managed=false</code> with <code>managed=true</code>. Then reboot.<br />
* For connections that drop and resume too often, it might be due to WiFi power management from earlier OS releases. Later OS releases either removed WiFi power management, or default to full power. (Power management can be turned off via command line with <code>iw dev wlan0 set power_save off</code> or <code>iwconfig wlan0 power off</code>, although it is not persistent through re-boot.)<br />
* For connections that drop under load on the default Debian, remove <code>iwconfig wlan0 power off</code> in the file <code>/etc/rc.local</code>.<br />
* If WiFi is unusable or often crashes when using an alternate OS, it might be because its WiFi firmware is not appropriate for the WiFi chip in the Pinebook Pro. Try the latest firmware patch from [https://gitlab.manjaro.org/tsys/pinebook-firmware/tree/master/brcm https://gitlab.manjaro.org/tsys/pinebook-firmware/tree/master/brcm]<br />
* After re-enabling WiFi via the privacy switch, you have to reboot to restore function. There is a workaround for the default Debian (which may work with others);<br />
&nbsp; &nbsp; &nbsp; &nbsp; <code>sudo tee /sys/bus/platform/drivers/dwmmc_rockchip/{un,}bind <<< 'fe310000.dwmmc'</code><br />
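To make the power-management setting persistent across reboots on systems using NetworkManager, a drop-in configuration file can be used. This is a sketch: <code>wifi.powersave</code> is a standard NetworkManager option (2 disables power saving, 3 enables it), but check your distro's defaults before relying on it;<br />
<pre># /etc/NetworkManager/conf.d/wifi-powersave.conf
[connection]
# 2 = disable WiFi power saving, 3 = enable
wifi.powersave = 2</pre><br />
After creating the file, restart NetworkManager with <code>sudo systemctl restart NetworkManager</code>.<br />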
<br />
=== Bluetooth issues ===<br />
* When connecting a Bluetooth device, such as a Bluetooth mouse, it does not automatically re-connect on re-boot. In the Bluetooth connection GUI, there is a yellow star for re-connect on boot. Use that button to enable a persistent connection. It can be changed back later.<br />
* Bluetooth-attached speakers or headset require the <b>pulseaudio-module-bluetooth</b> package. If not already installed, it can be installed with a package manager or with:<br><br />
<pre>sudo apt-get install pulseaudio-module-bluetooth</pre><br />
* When using Bluetooth-attached speakers or headset and 2.4Ghz WiFi at the same time, you may experience stuttering of the audio. One solution is to use 5Ghz WiFi if you can. Or you may try using a different 2.4Ghz channel, perhaps channel 1 or the top channel, (11 in the USA, or 13/14 in some other countries).<br />
<br />
=== Sound issues ===<br />
* Many reports of no sound are due to the OS, incorrect settings, or other software problems (e.g. PulseAudio). So first test whether it is a software or hardware problem by trying another OS via SD card. (For example, if Debian is installed on the eMMC, try Ubuntu on SD.) <br />
* If you cannot get sound from the headphone jack, but can get sound from the speakers, then the headphone / UART console switch may be set to the UART mode. You can open the back and check the position of the switch. If set to UART mode, switch it to headphone mode. See the parts layout for the location and correct position of the switch.<br />
* When using the USB C alternate DisplayPort mode, it is possible that the audio has been re-directed through this path. If your monitor has speakers, see if they work.<br />
* See [https://gitlab.manjaro.org/manjaro-arm/packages/community/pinebookpro-post-install/blob/master/asound.state manjaro-arm/pinebookpro-post-install /var/lib/alsa/asound.state] for some ALSA tweaks.<br />
* See [https://gitlab.manjaro.org/manjaro-arm/packages/community/pinebookpro-audio manjaro-arm/pinebookpro-audio] for how to handle 3.5mm jack plug/unplug events with ACPID.<br />
* Several users have reported that one internal speaker had reversed polarity. Thus, sound from the speakers has an echo-like effect.<br />
** There is a software fix using alsamixer: enable either "R invert" or "L invert". However, the headphones will then have incorrect audio.<br />
** The permanent fix is to re-wire one speaker, though this requires soldering small wires.<br />
<br />
=== USB docks & USB C alternate mode video ===<br />
The Pinebook Pro uses the RK3399 SoC (System on a Chip). It supports a video pass through mode on the USB C port using DisplayPort alternate mode. This DisplayPort output comes from the same GPU used to display the built-in LCD. <br />
<br />
Here are some selection criteria for successfully using the USB C alternate mode for video:<br />
* The device must use USB C alternate mode DisplayPort. Not USB C alternate mode HDMI, or other.<br />
* The device can have an HDMI, DVI, or VGA connector, if it uses an active translator.<br />
* If USB 3 is also desired from a USB dock, the DisplayPort bandwidth is halved, reducing the maximum resolution, frame rate, or pixel depth. For example, 4K @ 30Hz instead of 60Hz.<br />
* USB docks that also use USB C alternate mode DisplayPort will always have USB 2 available, (480Mbps, half-duplex).<br />
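The halving described above can be sanity-checked with rough arithmetic. The sketch below assumes HBR2 DisplayPort lanes (5.4 Gbit/s each, 8b/10b encoded) and ignores blanking and protocol overhead, so the numbers are only approximate;<br />

```shell
# Rough DisplayPort alt-mode bandwidth check (a sketch, not a spec).
lane_bps=$((5400000000 * 8 / 10))   # 5.4 Gbit/s per HBR2 lane -> 4.32 Gbit/s payload

video_bps() {  # width height fps bits-per-pixel: raw pixel rate, no blanking
    echo $(( $1 * $2 * $3 * $4 ))
}

fits() {  # lanes width height fps
    [ "$(video_bps "$2" "$3" "$4" 24)" -le $(( $1 * lane_bps )) ] && echo yes || echo no
}

fits 4 3840 2160 60   # four dedicated lanes carry 4K60
fits 2 3840 2160 60   # two lanes (the USB 3 + DP split) cannot
fits 2 3840 2160 30   # but 4K30 fits in two lanes
```
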
<br />
=== Keys not registering / missing keys when typing ===<br />
This issue occurs when your thumb or edge of the palm makes contact with left or right tip of the trackpad when you type. This is due to the palm rejection firmware being too forceful. Instead of only disabling the trackpad, so your cursor does not move all over the screen, it disables both the trackpad and the keyboard.<br />
<br />
Using Fn+F7 to disable the touchpad will keep it from also disabling the keyboard.<br />
<br />
A [[Pinebook_Pro#Trackpad|firmware update]] has been released to address this.<br />
<br />
=== Key Mapping ===<br />
* See this [https://gitlab.manjaro.org/manjaro-arm/packages/community/pinebookpro-post-install/blob/master/10-usb-kbd.hwdb /etc/udev/hwdb.d/10-usb-kbd.hwdb] for some key mapping tweaks<br />
<br />
=== Pinebook Pro gets stuck after first reboot in Trackpad Firmware Update ===<br />
This refers to the firmware update shown here:<br />
https://github.com/ayufan-rock64/pinebook-pro-keyboard-updater#update-all-firmwares<br />
* If the system is not responding after the 1st reboot, it might be easiest to do a system restore, and follow up by running the second step of the trackpad firmware update. <br />
* System restore https://forum.pine64.org/showthread.php?tid=8229<br />
* Firmware update https://github.com/ayufan-rock64/pinebook-pro-keyboard-updater#update-all-firmwares<br />
<br />
=== ANSI Fn + F keys wrong for F9, F10, F11 and F12 ===<br />
There appears to be a minor firmware issue for ANSI keyboard models of the Pinebook Pro. Some discussion and fixes have been proposed;<br />
<br />
* Discussion thread [https://forum.pine64.org/showthread.php?tid=8744&pid=57678#pid57678 Fn + F keys screwy for F9, F10, F11 and F12]<br />
* Proposed fix [https://github.com/ayufan-rock64/pinebook-pro-keyboard-updater/issues/14#issuecomment-576825396 (ANSI) Fn + F(9-12) has wrong assignment after firmware update #14]<br />
&nbsp;<br><br />
<br />
=== After changing builtin LCD resolution, blank screen ===<br />
Some people find that the text or icons are too small, so they attempt to change the resolution of the built-in display. Afterwards, the display is blank.<br><br />
Use the following fix when logged into a text console as yourself (Control-Alt-F1 through F6). After listing the resolutions, select the native resolution (1920x1080, aka 1080p). <br />
<pre>export DISPLAY=:0.0<br />
xrandr -q<br />
xrandr -s [resolution]</pre><br />
Once your resolution is restored, try using the Tweak app to set scaling, instead.<br><br />
<br><br />
If the above fix did not work, you can try this:<br />
* Using a text console, (Control-Alt-F1), login with your normal user ID<br />
* Edit the file <code>nano ~/.config/monitors.xml</code><br />
* Change the "width" value to "1920"<br />
* Change the "height" value to "1080"<br />
* If there is more than one monitor configuration listed, edit that one too.<br>Be careful to make no other changes. If needed, exit without saving and re-edit.<br />
* Save the file and exit.<br />
* Login using the GUI and test<br />
* If you are still logged in via the GUI, you will have to reboot using <code>sudo shutdown -r now</code><br>After the reboot, you should be able to login to the GUI and have the resolution back to normal.<br />
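For reference, the relevant fragment of <code>~/.config/monitors.xml</code> looks roughly like this (trimmed for illustration; the exact layout varies between desktop versions, so only change the width and height values):<br />
<pre><monitors version="2">
  ...
      <mode>
        <width>1920</width>
        <height>1080</height>
        ...
      </mode>
  ...
</monitors></pre><br />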
When you have restored usability to your Pinebook Pro's graphical screen, see this section on improving readability and usability:<br><br />
[[Pinebook_Pro#Improving_readability|Improving readability]]<br />
<br />
=== Cracks in the plastic ===<br />
There have been multiple reports of cracks in the plastic keyboard & trackpad part of the case. These are generally found near:<br />
* Hinges<br />
* USB ports<br />
* Top side, around the corners<br />
This seems to apply to the first batches in 2019. Later versions of the keyboard & trackpad have used better plastic. With replacements now in the Pine64 Store, it's possible to simply order a replacement.<br><br />
<br><br />
There have been a few reports of cracks in the plastic around the LCD display, but these appear to be less common. There are replacement LCDs with bezel available in the Pine64 Store.<br />
<br />
= [[Pinebook_Pro_Hardware_Accessory_Compatibility|PineBookPro Hardware Compatibility]] =<br />
<br />
Please contribute to the hardware compatibility page, which lists hardware which has been tested with the PBP, whether successful or not!<br />
<br />
* [[Pinebook_Pro_Hardware_Accessory_Compatibility#NVMe_SSD_drives|NVMe SSD drives]]<br />
* [[Pinebook_Pro_Hardware_Accessory_Compatibility#USB_hardware|USB hardware]]<br />
* [[Pinebook_Pro_Hardware_Accessory_Compatibility#USB_C_alternate_mode_DP|USB C alternate mode DP]]<br />
* [[Pinebook_Pro_Hardware_Accessory_Compatibility#Other_hardware|Other hardware]]<br />
&nbsp;<br/><br />
&nbsp;<br />
<br />
= Technical Reference =<br />
== Accessing the Internals - Disassembly and Reassembly == <br />
[[File:Standoffs.png|400px|thumb|right|Pinebook Screw stand-offs correct placement and location]]<br />
<br />
'''WARNING:''' Do not open the laptop by lifting the lid while the Pinebook Pro bottom cover is removed - this can cause structural damage to the hinges and/or other plastic components of the chassis such as the IO port cut-outs.<br />
<br />
'''WARNING:''' When removing the back cover plate, use care if sliding fingertips between back cover plate and palm rest assembly. The back cover plate edges are sharp.<br />
<br />
When disassembling the laptop make sure that it is powered off and folded closed. To remove the bottom cover of the Pinebook Pro, first remove the ten (10) Phillips head screws that hold the bottom section of the laptop in place. Remove the cover from the back where the hinges are situated by lifting it up and away from the rest of the chassis.<br />
<br />
During reassembly, make sure that the back-screw standoffs are in place and seated correctly. To reassemble the Pinebook Pro, slide the bottom section into place so it meets the front lip of the keyboard section. Secure the front section (where the trackpad is located) in place using the short screws in the front left and right corners. Then proceed to pop the bottom panel into place. Secure the bottom section (where the hinges are located) by screwing in the left and right corners. Then screw in the remaining screws and run your finger through the rim of the chassis to make sure it is fitted correctly. Note that the front uses the remaining 2 short screws.<br />
<br />
NOTE: The screws are small and should only be finger tight. Too much force will strip the threads. If after installing screws the back cover plate has not seated properly on one side, open the display and hold the base on either side of the keyboard and gently flex the base with both hands in opposing directions. Once the side pops further in, then recheck the screws on that side. If it does not pop back in, just let it be.<br />
<br />
NOTE: a basic 3D model for printing replacement back-screw standoffs is available on Thingiverse [https://www.thingiverse.com/thing:4226648], pending release of something more definitive.<br />
<br />
== Pinebook Pro Internal Layout ==<br />
=== Main chips ===<br />
* RK3399 system-on-chip (1)<br />
* LPDDR4 SDRAM (21)<br />
* SPI NOR flash memory (29)<br />
* eMMC flash memory (26)<br />
* WiFi/BT module (27)<br />
<br />
=== Mainboard Switches and Buttons ===<br />
There are two switches on the main board: disabling the eMMC (24), and enabling UART (9) via headphone jack. <br />
<br />
The Reset and Recovery buttons (28): the reset button performs an immediate reset of the laptop. The Recovery button is used to place the device in maskrom mode; this mode allows flashing eMMC using Rockchip tools (e.g. rkflashtools). <br />
<br />
[[File:PBPL_S.jpg]]<br />
<br />
=== Key Internal Parts ===<br />
{| class="wikitable"<br />
|+ Numbered parts classification and description<br />
! Number<br />
! Type<br />
! Descriptor<br />
|-<br />
! scope=row | 1<br />
| Component || RK3399 System-On-Chip<br />
|-<br />
! scope=row | 2<br />
| Socket || PCIe 4X socket for optional NVMe adapter<br />
|-<br />
! scope=row | 3<br />
| Socket || Speakers socket<br />
|-<br />
! scope=row | 4<br />
| Socket || Trackpad socket<br />
|-<br />
! scope=row | 5<br />
| Component || Left speaker <br />
|-<br />
! scope=row | 6<br />
| Connector || Male power bridge connector <br />
|-<br />
! scope=row | 7<br />
| Socket || Keyboard Socket<br />
|-<br />
! scope=row | 8<br />
| Component || Optional NVMe SSD adapter<br />
|-<br />
! scope=row | 9<br />
| Switch || UART/Audio switch - outputs UART via headphone jack<br />
|-<br />
! scope=row | 10<br />
| Socket || Female power bridge socket<br />
|-<br />
! scope=row | 11<br />
| Socket || Battery socket<br />
|-<br />
! scope=row | 12<br />
| Component || Trackpad<br />
|-<br />
! scope=row | 13<br />
| Component || Battery<br />
|-<br />
! scope=row | 14<br />
| Component || Right speaker<br />
|-<br />
! scope=row | 15<br />
| Socket || MicroSD card slot<br />
|-<br />
! scope=row | 16<br />
| Socket || Headphone / UART jack<br />
|-<br />
! scope=row | 17<br />
| Socket || USB 2.0 Type A<br />
|-<br />
! scope=row | 18<br />
| Socket || Daughterboard-to-mainboard ribbon cable socket<br />
|-<br />
! scope=row | 19<br />
| Cable || Daughterboard-to-mainboard ribbon cable<br />
|-<br />
! scope=row | 20<br />
| Component || microphone<br />
|-<br />
! scope=row | 21<br />
| Component || LPDDR4 RAM<br />
|-<br />
! scope=row | 22<br />
| Socket || Mainboard-to-daughterboard ribbon cable socket<br />
|-<br />
! scope=row | 23<br />
| Socket || Microphone socket<br />
|-<br />
! scope=row | 24<br />
| Switch || Switch to hardware disable eMMC<br />
|-<br />
! scope=row | 25<br />
| Antenna || BT/WiFI antenna<br />
|-<br />
! scope=row | 26<br />
| Component || eMMC flash memory module <br />
|-<br />
! scope=row | 27<br />
| Component ||BT/WiFi module chip<br />
|-<br />
! scope=row | 28<br />
| Buttons || Reset and recovery buttons<br />
|-<br />
! scope=row | 29<br />
| Component || SPI flash storage<br />
|-<br />
! scope=row | 30<br />
| Socket || eDP LCD socket<br />
|-<br />
! scope=row | 31<br />
| Socket || Power in barrel socket<br />
|-<br />
! scope=row | 32<br />
| Socket || USB 3.0 Type A<br />
|-<br />
! scope=row | 33<br />
| Socket || USB 3.0 Type C <br />
|}<br />
<br />
=== Smallboard detailed picture ===<br />
<br />
[[File:Pinebook_pro_smallboard.jpg]]<br />
<br />
== Bootable Storage ==<br />
The Pinebook Pro is capable of booting from eMMC, USB 2.0, USB 3.0, or an SD card. It cannot boot from USB-C. The boot order of the hard-coded ROM of its RK3399 SoC is: SPI NOR, eMMC, SD, USB OTG. <br />
<br />
At this time, the Pinebook Pro ships with a Manjaro + KDE build with [https://www.denx.de/wiki/U-Boot/ U-Boot] on the eMMC. Its boot order is: SD, USB, then eMMC.<br />
<br />
(An update has been pushed for the older Debian + MATE build that improves compatibility with booting other OSs from an SD card. In order to update, fully charge the battery, establish an internet connection, click the update icon in the toolbar, and then reboot your Pinebook Pro. Please see [https://forum.pine64.org/showthread.php?tid=7830 this log] for details.)<br />
<br />
Please note that PCIe, the interface used for the NVMe SSD on the Pinebook Pro, is not bootable on the RK3399 and therefore is not part of the boot hierarchy. It is possible to run the desired OS from NVMe by pointing extlinux on the eMMC to the rootfs on the SSD. This requires U-Boot, the kernel image, the DTB, and extlinux.conf in a /boot partition on the eMMC.<br />
<br />
=== eMMC information ===<br />
The eMMC appears to be hot-pluggable. This can be useful when trying to recover data or a broken install. Best practice is probably to turn the eMMC switch to the off position before changing modules.<br />
<br />
The eMMC storage will show up as multiple block devices:<br />
*mmcblk1boot0 - eMMC standard boot0 partition, may be 4MB<br />
*mmcblk1boot1 - eMMC standard boot1 partition, may be 4MB<br />
*mmcblk1rpmb - eMMC standard secure data partition, may be 16MB<br />
*mmcblk1 - This block contains the user areas<br />
<br />
Only the last is usable as a regular storage device in the Pinebook Pro.<br />
The device number of "1" shown above may vary, depending on the kernel.<br />
<br />
If the eMMC module is enabled after boot from an SD card, you can detect this change with the following commands as user "root";<br />
<pre><br />
echo fe330000.sdhci >/sys/bus/platform/drivers/sdhci-arasan/unbind<br />
echo fe330000.sdhci >/sys/bus/platform/drivers/sdhci-arasan/bind<br />
</pre><br />
<br />
=== Boot sequence details ===<br />
The RK3399's 32KB mask ROM boot code looks for the next stage of code at byte offset 32768 (sector 64, if using 512 byte sectors). This is where U-Boot code would reside on any bootable media.<br><br />
[[RK3399_boot_sequence|RK3399 boot sequence]]<br />
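As an illustration of that offset, the sketch below writes a placeholder payload to sector 64 of a scratch disk image and reads it back. File names and the payload are hypothetical; on real hardware you would write the actual loader image to the eMMC or SD device, where a mistake can leave the system unbootable, so never run <code>dd</code> against a real disk blindly;<br />

```shell
# Sector 64 * 512 bytes/sector = byte offset 32768, where the RK3399
# boot ROM expects the next-stage loader.
IMG=/tmp/emmc-sketch.img            # scratch file standing in for /dev/mmcblkX
truncate -s 8M "$IMG"
printf 'UBOOTSTAGE' > /tmp/payload.bin      # hypothetical placeholder payload
dd if=/tmp/payload.bin of="$IMG" seek=64 conv=notrunc 2>/dev/null
# Read back 10 bytes at byte offset 32768 to confirm placement:
dd if="$IMG" bs=1 skip=32768 count=10 2>/dev/null; echo
```
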
<br />
== Pinebook Pro Dimensions ==<br />
* Dimensions: 329mm x 220mm x 12mm (WxDxH)<br />
* Weight: 1.26Kg<br />
* Screws<br />
** Philips head type screws<br />
** M2 flat head machine screws (measurements in mm)<br />
** 4 x Small screws (used along the front edge): Head - 3.44, Thread Diameter - 1.97, Thread Length - 2.1, Overall length - 3.05<br />
** 6 x Large screws: Head - 3.44, Thread Diameter - 1.97, Thread Length - 4.41, Overall Length - 5.85<br />
* Rubber Feet<br />
** 18mm diameter<br />
** 3mm height<br />
** Dome shaped<br />
<br />
== SoC and Memory Specification ==<br />
[[File:Rockchip_RK3399.png|right]]<br />
* Based on Rockchip RK3399<br />
<br />
=== CPU Architecture ===<br />
* big.LITTLE architecture: Dual Cortex-A72 + Quad Cortex-A53, 64-bit CPU<br />
** Full implementation of the ARM architecture v8-A instruction set (both AArch64 and AArch32)<br />
** ARM Neon Advanced SIMD (single instruction, multiple data) support for accelerated media and signal processing computation<br />
** ARMv8 Cryptography Extensions<br />
** VFPv4 floating point unit supporting single and double-precision operations<br />
** Hardware virtualization support<br />
** TrustZone technology support<br />
** Full CoreSight debug solution<br />
** One isolated voltage domain to support DVFS<br />
* Cortex-A72 (big cluster):<br />
** [https://developer.arm.com/products/processors/cortex-a/cortex-a72 Dual-core Cortex-A72 up to 2.0GHz CPU]<br />
** Superscalar, variable-length, out-of-order pipeline<br />
** L1 cache 48KB Icache and 32KB Dcache for each A72 <br />
** L2 cache 1024KB for big cluster <br />
* Cortex-A53 (little cluster):<br />
** [https://developer.arm.com/products/processors/cortex-a/cortex-a53 Quad-core Cortex-A53 up to 1.5GHz CPU]<br />
** In-order pipeline with symmetric dual-issue of most instructions <br />
** L1 cache 32KB Icache and 32KB Dcache for each A53<br />
** L2 cache 512KB for little cluster<br />
* Cortex-M0 (control processors):<br />
** [https://developer.arm.com/ip-products/processors/cortex-m/cortex-m0 Cortex-M0 CPU]<br />
** Two Cortex-M0 cooperate with the central processors<br />
** Architecture: Armv6-M<br />
** Thumb/Thumb2 instruction set<br />
** 32 bit only<br />
<br />
=== GPU Architecture ===<br />
* [https://developer.arm.com/products/graphics-and-multimedia/mali-gpus/mali-t860-and-mali-t880-gpus ARM Mali-T860MP4 Quad-core GPU]<br />
* Among the highest-performance GPUs built on Arm Mali's Midgard architecture, the Mali-T860 is designed for complex graphics use cases and UHD content.<br />
* Frequency 650MHz <br />
* Throughput 1300Mtri/s, 10.4Gpix/s <br />
* OpenGL® ES 1.1, 2.0, 3.0, 3.1, 3.2; Vulkan 1.0; OpenCL™ 1.1, 1.2; DirectX® 11 FL11_1; RenderScript™<br />
<br />
=== System Memory ===<br />
* RAM Memory:<br />
** LPDDR4<br />
** Dual memory channels on the CPU, each 32 bits wide<br />
** Quad memory channels on the RAM chip, each 16 bits wide, 2 bonded together for each CPU channel<br />
** 4GB as a single 366 pin mobile RAM chip<br />
* Storage Memory: <br />
** 64GB eMMC module, can be upgraded to a 128GB eMMC module. (The initial PINE64 community build version shipped with a 128GB eMMC.)<br />
** eMMC version 5.1, HS400, 8 bit on RK3399 side<br />
<br />
=== Video out ===<br />
* USB-C Alt mode DP<br />
* Up to 3840x2160 p60, dependent on the adapter (2 lanes versus 4 lanes)<br />
<br />
=== Expansion Ports ===<br />
* MicroSD card:<br />
** Bootable<br />
** Supports SD, SDHC and SDXC cards, up to 512GB tested. SDXC standard says 2TB is the maximum.<br />
** Version SD3.0, (MMC 4.5), up to 50MB/s<br />
** SD card Application Performance Class 1 (A1), (or better), recommended by some users, for better IOPS<br />
* USB ports:<br />
** 1 x USB 2.0 Type-A Host Port, bootable<br />
** 1 x USB 3.0 Type-A Host Port, 5Gbps, is not bootable<br />
** 1 x USB 3.0 Type-C OTG Port, 5Gbps, (includes laptop charging function), is not bootable<br />
** Note that high power USB devices may not work reliably on a PBP. Or they may draw enough power to drain the battery even when the PBP is plugged into A.C. One alternative is externally powered USB devices.<br />
* Headphone jack switchable to UART console mux circuit<br />
<br />
== Additional hardware ==<br />
Hardware that is not part of the SoC.<br />
<br />
=== Battery ===<br />
* Lithium Polymer battery (10,000 mAh)<br />
<br />
=== Display ===<br />
* 14.0" 1920x1080 IPS LCD panel<br />
<br />
=== Audio ===<br />
* 3.5mm stereo earphone/microphone plug<br />
* Built-in stereo speakers:<br />
** Oval in design<br />
** 3 mm high x 20 mm x 30 mm<br />
<br />
=== Network ===<br />
* WiFi:<br />
** 802.11 b/g/n/ac<br />
** Dual band: 2.4Ghz & 5Ghz<br />
** Single antenna<br />
* Bluetooth 5.0<br />
<br />
=== Optional NVMe adapter ===<br />
* PCIe 2.x, 5GT/s per lane<br />
* 4 PCIe lanes; cannot be bifurcated (however, it can be used with 1 or 2 lane NVMe cards)<br />
* '''M''' keyed, though '''M'''+'''B''' keyed devices will work too<br />
* Maximum length for M.2 card is 80mm (M.2 2280). The following sizes will also work: 2230, 2242, 2260<br />
* Power: 2.5W continuous, 8.25W peak momentary<br />
* Does not support SATA M.2 cards<br />
* Does not support USB M.2 cards<br />
<br />
== Pinebook Pro Schematics and Certifications ==<br />
* Pinebook Pro Main Board Schematic And Silkscreen:<br />
** [http://files.pine64.org/doc/PinebookPro/pinebookpro_v2.1_mainboard_schematic.pdf Pinebook Pro Main Board ver 2.1 Schematic]<br />
** [https://wiki.pine64.org/images/3/30/Pinebookpro-v2.1-top-ref.pdf Pinebook Pro ver 2.1 Top Layer Silkscreen]<br />
** [https://wiki.pine64.org/images/b/b7/Pinebookpro-v2.1-bottom-ref.pdf Pinebook Pro ver 2.1 Bottom Layer Silkscreen]<br />
* Pinebook Pro Daughter Board Schematic:<br />
** [http://files.pine64.org/doc/PinebookPro/pinebookpro_v2.1_daughterboard_schematic.pdf Pinebook Pro Daughter Board ver 2.1 Schematic]<br />
* Optional Pinebook Pro NVMe Adapter Schematic:<br />
** [http://files.pine64.org/doc/PinebookPro/pinebookpro_v2.1_NVMe-adapter_schematic.pdf Pinebook Pro NVMe Adapter Board ver 2.1 Schematic]<br />
* Serial Console Earphone Jack Pinout:<br />
** [http://files.pine64.org/doc/pinebook/guide/Pinebook_Earphone_Serial_Console_Developer_Guide.pdf Pinebook Serial Console Earphone Jack Pinout]<br />
* Pinebook Pro Case:<br />
** [https://send.firefox.com/download/b34c14f3e0a3d66d/#15Cx1vBaGKmJr57y85U2qQ AutoCAD DWG File]<br />
* Pinebook Pro Certifications:<br />
** [http://files.pine64.org/doc/cert/Pinebook%20Pro%20FCC%20Certificate-S19071103501001.pdf Pinebook Pro FCC Certificate]<br />
** [http://files.pine64.org/doc/cert/Pinebook%20Pro%20CE%20RED%20Certificate-S19051404304.pdf Pinebook Pro CE Certificate]<br />
** [http://files.pine64.org/doc/cert/Pinebook%20Pro%20ROHS%20Compliance%20Certificate.pdf Pinebook Pro RoHS Certificate]<br />
<br />
== Datasheets for Components and Peripherals ==<br />
* Rockchip RK3399 SoC information:<br />
** [http://www.rock-chips.com/a/en/products/RK33_Series/2016/0419/758.html Rockchip RK3399 SoC Brief]<br />
** [http://opensource.rock-chips.com/images/d/d7/Rockchip_RK3399_Datasheet_V2.1-20200323.pdf Rockchip RK3399 Datasheet V2.1]<br />
** [http://opensource.rock-chips.com/images/e/ee/Rockchip_RK3399TRM_V1.4_Part1-20170408.pdf Rockchip RK3399 Technical Reference Manual part 1]<br />
** [http://files.pine64.org/doc/datasheet/rockpro64/RK808%20datasheet%20V0.8.pdf Rockchip RK808 Datasheet V0.8]<br />
* LPDDR4 (366 Balls) SDRAM:<br />
** [http://files.pine64.org/doc/datasheet/PinebookPro/micron%20SM512M64Z01MD4BNK-053FT%20LPDDR4%20(366Ball).pdf Micron 366 balls Mobile LPDDR4 Datasheet]<br />
* eMMC information:<br />
** [http://files.pine64.org/doc/rock64/PINE64_eMMC_Module_20170719.pdf PINE64 eMMC module schematic]<br />
** [http://files.pine64.org/doc/rock64/usb%20emmc%20module%20adapter%20v2.pdf PINE64 USB adapter for eMMC module V2 schematic]<br />
** [http://files.pine64.org/doc/rock64/USB%20adapter%20for%20eMMC%20module%20PCB.tar PINE64 USB adapter for eMMC module PCB in JPEG]<br />
** [http://files.pine64.org/doc/datasheet/pine64/SDINADF4-16-128GB-H%20data%20sheet%20v1.13.pdf SanDisk eMMC Datasheet]<br />
* SPI NOR Flash information:<br />
** [http://files.pine64.org/doc/datasheet/pine64/w25q128jv%20spi%20revc%2011162016.pdf WinBond 128Mb SPI Flash Datasheet]<br />
** [https://wiki.pine64.org/images/b/b9/Ds-00220-gd25q127c-rev1-df2f4.pdf GigaDevice 128Mb SPI Flash Datasheet (UPDATED)]<br />
* Wireless related info:<br />
** [http://files.pine64.org/doc/datasheet/PinebookPro/AP6256%20datasheet_V1.7_12282018.pdf AMPAK AP6256 11AC Wi-Fi + Bluetooth5 Datasheet]<br />
* Audio Codec (ES8316)<br />
** [http://everest-semi.com/pdf/ES8316%20PB.pdf Everest ES8316 Audio Codec]<br />
* LCD Panel:<br />
** [http://files.pine64.org/doc/datasheet/PinebookPro/NV140FHM-N49_Rev.P0_20160804_201710235838.pdf 14" 1920x1080 IPS LCD Panel datasheet]<br />
* Internal USB 2 hub:<br />
** [https://wiki.pine64.org/images/3/39/GL850G_USB_Hub_1.07.pdf GL850G USB Hub 1.07.pdf]<br />
* Touchpad information:<br />
** [http://files.pine64.org/doc/datasheet/PinebookPro/YX%20HK-9562%20HID%20I2C%20Specification.pdf Touchpad Specification for Pinebook Pro model]<br />
* Keyboard information:<br />
** [https://wiki.pine64.org/images/b/b0/SH68F83V2.0.pdf Sinowealth SH68F83 Datasheet]<br />
** US ANSI: XK-HS002 MB27716023<br />
* Full HD Camera sensor:<br />
** [http://files.pine64.org/doc/datasheet/PinebookPro/HK-2145-263.pdf Full HD Camera module specification in Chinese]<br />
** [http://files.pine64.org/doc/datasheet/PinebookPro/GC2145%20CSP%20DataSheet%20release%20V1.0_20131201.pdf GalaxyCore GC2145 Full HD Camera Sensor Data Sheet]<br />
* Lithium Battery information:<br />
** [http://files.pine64.org/doc/datasheet/pinebook/40110175P%203.8V%2010000mAh规格书-14.pdf 10000mAh Lithium Battery Specification for 14" model]<br />
* NVMe adapter:<br />
** [https://datasheet.octopart.com/FH26W-35S-0.3SHW%2860%29-Hirose-datasheet-26676825.pdf Compatible, not OEM! Use FH26-35S-0.3SHW flat flex connector]<br />
<br />
== Versions ==<br />
Pinebook Pro v1 and v2 were prototype models that did not make it to the public. The "first batch" (First 100 forum preorders) onward are v2.1. [https://forum.pine64.org/showthread.php?tid=8111] <br />
<br />
=Skinning and Case Customization=<br />
* Template files for creating custom skins. Each includes template layers for art placement, and CUT lines.<br />
**[https://drive.google.com/open?id=1UKFlC53DO0GJm3Hz1E_669n_HhI45e4n Case Lid Template]<br />
**[https://drive.google.com/open?id=1Q6bKGarMDhvWz3HdGvhL5qDhyHb546ve Case Bottom Template]<br />
**[https://drive.google.com/open?id=1ugI74ygNJ3EN5jXks5jKvdpEAoxIzHo4 Case Palmrest Template]<br />
<br />
= Other Resources =<br />
* [https://forum.pine64.org/forumdisplay.php?fid=111 Pinebook Pro Forum]<br />
* [https://forum.pine64.org/forumdisplay.php?fid=98 ROCKPro64 Forum]<br />
* [[RockPro64 Guides]]<br />
* [https://riot.im/app/#/room/#pinebook:matrix.org Matrix Channel] (No login required to read)<br />
* IRC Server: irc.pine64.org Channel: PineBook<br />
* [https://discordapp.com/channels/463237927984693259/622348681538043924 Discord Channel]<br />
* [https://github.com/rockchip-linux Rockchip Linux GitHub Repo]<br />
* [http://opensource.rock-chips.com/ Rockchip Open Source Wiki]</div>Newton688https://wiki.pine64.org/index.php?title=PinePhone_Software_Releases&diff=5463PinePhone Software Releases2020-04-14T19:04:12Z<p>Newton688: /* Virgin Mobile (Canada) */</p>
<hr />
<div><br />
This page is intended to help you install a software release on your [[PinePhone]]. It also provides details about all available releases as well as links to further resources.<br />
<br />
= General instructions =<br />
<br />
Releases are first installed to a Micro SD card. Choose a card with fast I/O (of small files) for the best performance. See [[#Other Resources]] for performance tests of various SD cards.<br />
<br />
This section has generic installation instructions. Please see the [[#Software Releases]] section for specific installation instructions for each distribution.<br />
<br />
== Boot priority ==<br />
<br />
The default PinePhone boot priority is first the SD card and then the eMMC, so inserting an SD card with your preferred release will result in the phone booting your image. <br />
<br />
User ''megi'' has demonstrated his multi-boot development [https://www.youtube.com/watch?v=ZL1GREqoqx8 on YouTube]; see Other Resources at the bottom of this page for a link to his notes.<br />
<br />
== Preparation of SD card ==<br />
# Download your chosen image from the options below<br />
# Extract the compressed file<br />
# Write the image to your SD card<br />
# Plug SD card into phone<br />
# Boot phone<br />
If you need step-by-step instructions for writing an image to an SD card, check [[NOOB#Step-by-Step_Instructions_to_Flashing_MicroSD_Cards]] then return to this page.<br />
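The extract-and-write steps above can be dry-run entirely against regular files, so nothing real is overwritten. This is only a sketch: every file name is a placeholder, and on real hardware the final <code>dd</code> target would be your SD card device (e.g. /dev/sdX).<br />

```shell
set -e

# Stand-in for a downloaded, compressed release image: 4 MiB of
# random data, gzip-compressed (placeholder for step 1).
dd if=/dev/urandom of=release.img bs=1M count=4 status=none
sha256sum release.img > release.img.sha256
gzip release.img                 # leaves only release.img.gz

# Step 2: extract the compressed file.
gzip -d release.img.gz           # recreates release.img

# Step 3: write the image. The target here is a regular file; on
# real hardware it would be the SD card device (e.g. /dev/sdX).
dd if=release.img of=sdcard.img bs=1M conv=fsync status=none

# Verify the written copy matches the original image exactly.
sha256sum -c release.img.sha256
cmp release.img sdcard.img && echo "write verified"
```

On a real flash, <code>conv=fsync</code> makes <code>dd</code> flush data to the device before exiting, so the card can be removed safely once the command returns.<br />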
<br />
== Installation to eMMC (Optional) ==<br />
<br />
=== Method Using Factory Installed Tools ===<br />
<br />
The initial OS you get with your phone has the option to flash an image on SD card to eMMC. <br />
<br />
# Copy (not flash) the image file to a formatted SD card.<br />
# Insert SD card into powered-off phone.<br />
# Turn on phone and select option to install to eMMC.<br />
<br />
=== Safe & Easy Method ===<br />
<br />
[https://forum.pine64.org/showthread.php?tid=9444 Jumpdrive thread]<br />
<br />
# download and extract [https://github.com/dreemurrs-embedded/Jumpdrive/releases the Jumpdrive image]<br />
# flash the Jumpdrive image to a SD card<br />
# boot from the SD card<br />
# connect the PinePhone to your computer using USB-A -> USB-C cable.<br />
# flash the exposed (mounted) PinePhone drive with a chosen OS image as you'd flash any SD card, and resize partitions (optional, see below)<br />
# disconnect the PinePhone from your PC, power it down and remove the Jumpdrive SD card<br />
# boot into your OS of choice on eMMC<br />
<br />
The Jumpdrive image is smaller than 50MB. You can keep an SD card specifically for using Jumpdrive, and there are 64MB micro SD cards sold cheaply that will suffice.<br />
<br />
=== Safe With No Extra Tools, But Slower ===<br />
<br />
# Prepare a formatted SD card, flash desired OS to the SD card, and (optionally) resize the partition (see below)<br />
# Insert SD card and boot the phone<br />
# Open a terminal and <code>git clone [url]</code> your desired project, or open a web browser and download the desired OS image file. <br />
# Build the OS (Optional)<br />
# Flash the image file you obtained (by downloading or building) to the eMMC, e.g. with <code>dd if=your-image.img of=/dev/mmcblkY bs=1M</code>, where Y is the device number of the eMMC. Use <code>lsblk</code> to check your devices: with the current kernel the SD card is typically /dev/mmcblk0 and the eMMC is /dev/mmcblk2, but as always with ''dd'', be extremely careful to get the target device correct. Then resize the partition to fill the entire disk (see below).<br />
# Turn off phone, remove SD card. Turn on phone.<br />
<br />
=== Risky Method ===<br />
<br />
Warning: This copies a mounted filesystem, which can lead to instability, erratic behavior, and data corruption. Do not use long term.<br />
<br />
# Prepare a new SD card, flash desired OS to the SD card<br />
# Boot the phone with your new SD card image<br />
# Within the booted OS, flash/clone the running OS to eMMC, e.g. using dd. It will take about 15 minutes (depending on the speed of your card), and in the end it may show an error about not enough space - just ignore it.<br />
# Turn off phone, take out SD card, and try booting the phone which should load up the new OS from eMMC.<br />
# Open terminal and resize partition to fill up entire disk (see below).<br />
<br />
== Resize partition to fit disk space ==<br />
<br />
Once you've flashed the OS to your SD card or eMMC storage, you may also need to expand the partition to fill all the available space.<br />
<br />
=== Resize SD card's partition using computer ===<br />
<br />
For SD cards, insert the SD card and resize the partitions through the computer. For eMMC, insert the phone cable and use Jumpdrive to access the eMMC directly, and resize the partition after flashing the image.<br />
<br />
Using Growpart: <br />
<br />
 growpart /dev/sdX 1<br />
 resize2fs /dev/sdX1<br />
<br />
Locate growpart (<code>apt-cache search growpart</code> and install the package in the search results) and run: <br />
growpart /dev/mmcblkX Y<br />
resize2fs /dev/mmcblkXpY<br />
where X is the storage device and Y is the partition number (viewable from lsblk).<br />
<br />
If you get any errors about missing or unknown commands, use apt-cache search to find and install the needed software. Also don't forget to use sudo.<br />
<br />
Using Parted: <br />
<br />
Parted's interactive mode and resize work well together. Do this before you put your SD card into the PinePhone for the first time for best results.<br />
<br />
sudo parted /dev/<your_sd_card_device><br />
(parted) resizepart 2 100%<br />
(parted) quit<br />
 sudo resize2fs /dev/<the_second_sd_card_PARTITION><br />
<br />
=== Resize from within PinePhone: ===<br />
<br />
eMMC: you would need to resize the partition on eMMC (flashed with the operating system) by booting another image from the SD card: that way, the eMMC will be unmounted. It is '''not recommended''' to resize eMMC while booted from eMMC! Resizing a currently mounted partition can have weird results. <br />
<br />
SD card: It is generally not possible to boot from eMMC to partition the unmounted SD card, because of the boot order -- you would have to write the image to the empty SD card first, then resize partition, all without rebooting. It is also '''not recommended''' to resize the SD card while booted from SD card! Resizing a currently mounted partition can have weird results.<br />
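Both warnings come down to the same rule: never resize a partition that is currently mounted. Before running growpart or resize2fs you can check programmatically; in this sketch <code>is_mounted</code> is a hypothetical helper defined here (not a standard command) that simply scans /proc/mounts.<br />

```shell
# is_mounted: hypothetical helper; succeeds if the given device or
# mount point appears in /proc/mounts (field 1 = device, field 2 =
# mount point).
is_mounted() {
    awk -v t="$1" '$1 == t || $2 == t { found = 1 } END { exit !found }' /proc/mounts
}

# Example: refuse to resize the eMMC root partition while it is mounted.
if is_mounted /dev/mmcblk2p2; then
    echo "refusing to resize: partition is mounted" >&2
else
    echo "safe to run growpart / resize2fs on /dev/mmcblk2p2"
fi
```

Scanning /proc/mounts directly keeps the check dependency-free; tools like <code>findmnt</code> do the same job where available.<br />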
<br />
== Installing Any ARM64 Distribution ==<br />
'''Warning:''' Distributions not on this page may not even boot after you follow this section. In the best case, they will be barely usable. This is more for fun, or if you would like to port a new distro to the PinePhone.<br />
<br />
'''Note:''' This section uses megi's kernel releases, and not the official ones from PINE64. While it is possible to use the official (and in the future, mainline) kernel, megi provides binary releases, which makes it very easy.<br />
<br />
If you would like to see specific commands for how to complete these steps, see https://github.com/nikhiljha/pp-fedora-sdsetup (an example for Fedora) or https://xnux.eu/howtos/install-arch-linux-arm.html (an example for Arch Linux).<br />
<br />
# Create a boot partition (from 4 MiB to about 252 MiB) and a root partition (from 252 MiB to the end of the card) on the SD card.<br />
# Format the boot partition with vfat, and the root partition with f2fs.<br />
# Extract the root filesystem from your distro's ARM image into the root partition on the SD card. Do not copy the partition; copy the files (in archive mode, e.g. rsync -ar).<br />
# Edit /etc/fstab to match your partitions.<br />
# Grab megi's kernel from https://xff.cz/kernels/ (you probably want 5.6).<br />
# Follow the README instructions, which involves copying the kernel modules into the sd card rootfs and writing u-boot and the bootloader.<br />
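The partition layout from steps 1 and 2 can be written down as an sfdisk script. This is a sketch assuming 512-byte sectors; the file name pinephone.sfdisk is arbitrary, and megi's README remains the authoritative reference for the exact offsets.<br />

```shell
# Boot partition from 4 MiB to 252 MiB (FAT, type 0x0c), root from
# 252 MiB to the end of the card (Linux, type 0x83). Offsets are in
# 512-byte sectors.
BOOT_START=$(( 4 * 1024 * 1024 / 512 ))          # 8192
BOOT_SIZE=$(( (252 - 4) * 1024 * 1024 / 512 ))   # 507904
ROOT_START=$(( 252 * 1024 * 1024 / 512 ))        # 516096

cat > pinephone.sfdisk <<EOF
label: dos
start=${BOOT_START}, size=${BOOT_SIZE}, type=c
start=${ROOT_START}, type=83
EOF

# Apply later (destructive!) with: sfdisk /dev/mmcblkX < pinephone.sfdisk
# then mkfs.vfat the first partition and mkfs.f2fs the second.
cat pinephone.sfdisk
```

Keeping the offsets in a script makes the layout reproducible and lets you review the sector arithmetic before touching a real card.<br />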
<br />
== Backlight ==<br />
None of the current distributions has a good setup for the backlight at low brightness.<br />
If configured too low, the backlight shuts down completely, but the screen still updates and remains usable under bright ambient light.<br />
<br />
Sailfish is one OS that initially uses automatic backlight control, and the default setting can make the screen appear blank. By shining a bright light on the screen you can still navigate (and the screen may switch on temporarily due to the light sensor), which makes it possible to disable auto brightness under Settings → Display.<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
= Software Releases =<br />
<br />
== postmarketOS ==<br />
http://files.pine64.org/sw/pine64_installer/json/postmarketos.png<br />
<br />
postmarketOS is a preconfigured version of [https://www.alpinelinux.org/ Alpine Linux] for mobile devices that offers a choice of several desktop environments including Plasma Mobile and phosh. <br />
<br />
* ''' Download location '''<br />
Rather than downloading a demo image, [https://wiki.postmarketos.org/wiki/Installation_guide postmarketOS recommends] using their script, pmbootstrap, which can build a tailored image for your SD card. See for example [https://forum.pine64.org/showthread.php?tid=8285 this forum thread.] <br />
<br />
Note pmbootstrap offers an option [https://wiki.postmarketos.org/wiki/PINE64_PinePhone_(pine64-pinephone)#Installation to install to the eMMC.]<br />
<br />
[http://images.postmarketos.org/pinephone/ Demo images can be found here.]<br />
<br />
* ''' user-id/password '''<br />
demo/147147 (for demo images only - when building an image with pmbootstrap you set your own user-id and password. '''NOTE: currently some lock screens require your password but only present a numeric keyboard, so you should use ''only'' numbers in your password until you have verified you can unlock with other characters.''')<br />
<br />
* ''' What works, what does not work '''<br />
See [https://wiki.postmarketos.org/wiki/PINE64_PinePhone_(pine64-pinephone) postmarketOS dedicated PinePhone wiki page]<br />
<br />
If you install the Firefox browser (or are using a demo image that already has it installed), [https://wiki.postmarketos.org/wiki/Firefox these hints in the pmOS wiki are recommended:] GDK_SCALE=1 works best for the PinePhone screen, and enabling Wayland makes applications fit the screen and allows keyboard entry.<br />
<br />
* ''' Where/how to report defects '''<br />
[https://gitlab.com/postmarketOS/postmarketos/issues/3 postmarketOS issue tracker for PinePhone support]<br />
<br />
* ''' Contributions '''<br />
[https://wiki.postmarketos.org/wiki/Contributing See postmarketOS wiki for options to contribute.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Ubuntu Touch by UBPorts ==<br />
http://files.pine64.org/sw/pine64_installer/json/ubports.png<br />
A Mobile Version of the Ubuntu Operating System made and maintained by the UBports Community.<br />
<br />
A short, state-of-the-art (as of 2 April 2020) [https://youtu.be/3Ne6G0-hn9g demo on YouTube.]<br />
<br />
[https://ubuntu-touch.io/ Ubuntu Touch] is a mobile version of Ubuntu developed by the UBports community. Images can be downloaded from [https://ci.ubports.com/job/rootfs/job/rootfs-pinephone/ here]. There is also a [https://github.com/goddard/pinephone/ script] to download the latest image and flash it to your PinePhone. In the future, Ubuntu Touch will be able to be installed onto the PinePhone with the [https://ubuntu-touch.io/get-ut UBports installer] GUI tool.<br />
<br />
* ''' Download location '''<br />
[https://gitlab.com/ubports/community-ports/pinephone See UBports gitlab page.]<br />
<br />
* ''' user-id/password '''<br />
The default password is <code>phablet</code><br />
<br />
* ''' What works, what does not work '''<br />
[https://gitlab.com/ubports/community-ports/pinephone Scroll down to the bottom of this page.]<br />
<br />
* ''' Where/how to report defects '''<br />
[https://gitlab.com/ubports/community-ports/pinephone See UBports gitlab page.]<br />
<br />
* ''' Contributions '''<br />
[https://ubports.com/foundation/sponsors See UBports website for how to donate.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Debian-PinePhone ==<br />
[[File:Debian-logo.png]]<br />
<br />
See [https://forum.pine64.org/showthread.php?tid=9016 this thread in the forum.]<br />
<br />
An unofficial Debian build for ARM64 running with the [https://developer.puri.sm/Librem5/Software_Reference/Environments/Phosh.html phosh user interface] (developed by Purism, phosh uses [https://en.wikipedia.org/wiki/Wayland_(display_server_protocol) Wayland] instead of [https://en.wikipedia.org/wiki/X.Org_Server Xorg]). The base system is pure Debian, with only the GUI applications and a few others (ModemManager, Wifi chip firmware) being built from modified sources (as well as the kernel and u-boot, of course). Current version is Debian Bullseye. <br />
<br />
* ''' Download location '''<br />
[http://pinephone.a-wai.com/images/ Debian-pinephone downloadable images]<br />
<br />
Don't forget to extract the image before installing: <br />
$ gzip -d Downloads/debian-pinephone-*.img.gz<br />
<br />
See the [https://gitlab.com/a-wai/debos-pinephone project page] for specific installation instructions.<br />
<br />
* ''' user-id/password '''<br />
debian/1234<br />
<br />
* ''' What works, what does not work '''<br />
<br />
See [https://gitlab.com/a-wai/debos-pinephone/-/blob/master/README.md the project's README file] for most up to date status.<br />
<br />
Check [https://gitlab.com/a-wai/debos-pinephone/-/issues bug tracker] for known issues. Questions? Ask on our [https://forum.pine64.org/showthread.php?tid=9016 thread].<br />
<br />
* ''' Where/how to report defects '''<br />
It is recommended that you log your bug reports in [https://gitlab.com/a-wai/debos-pinephone/issues/ the project's issue tracker.] As a general rule, issues with third-party apps (even the default ones) should be reported upstream. A Debian-PinePhone issue would be related to getting the hardware to work on the PinePhone, but if unsure where the issue should be reported just open a ticket or ask.<br />
<br />
* ''' Contributions '''<br />
Feel free to pick an open issue to work on, or send a merge request on [https://gitlab.com/a-wai/debos-pinephone/ Gitlab.]<br />
<br />
* ''' User Experience Notes '''<br />
<br />
If not already mentioned on the project page, the [https://forum.pine64.org/showthread.php?tid=9016 thread] might have known workarounds to software and user experience issues as contributed by the users.<br />
<br />
'''The Chatty app''' requires that in order to start a new text, you need to enter +[country code]-[phone number]. Without the + and the country code (+1 for USA) you won't be able to send a new text. <br />
<br />
'''To adjust screen resolution''' [https://puri.sm/posts/easy-librem-5-app-development-scale-the-screen/] [https://forum.pine64.org/showthread.php?tid=9016&pid=61403#pid61403] [https://forum.pine64.org/showthread.php?tid=9016&pid=61685#pid61685] <br />
<br />
# <code>sudo apt install linux-libc-dev build-essential ninja-build meson cmake libwayland-dev</code><br />
# Continue the rest of the instructions on [https://puri.sm/posts/easy-librem-5-app-development-scale-the-screen/ this page]<br />
# When you finish, you will have a touch-capable app you can use to adjust resolution any time, useful when switching between various apps. Unlike other solutions, this works across reboots.<br />
<br />
'''Most of Debian's repository is available.''' Some packages that apt won't find need to be cross-compiled ("ported") to ARM64 (see [https://wiki.debian.org/Arm64Port Debian's wiki on ARM64 port]), but the process is fairly easy. Most developers package their software only for the AMD64 version of Debian, so those packages will throw an error when run; if you have the source code, you can compile it to run on ARM64/PinePhone. If you do so, you should contact the developers so they can provide precompiled ARM64 packages for others in the future. You should also contact Debian if you have working ARM64 packages not listed on [https://wiki.debian.org/Arm64Port this page], since this helps them track the status of the ARM64 port. Give their wiki page some TLC.<br />
<br />
'''Apps that don't work with Wayland''': if you encounter an app that only works with X11 and not Wayland, report it upstream to the app's developers.<br />
<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== SailfishOS ==<br />
http://files.pine64.org/sw/pine64_installer/json/sailfishos.png<br />
Sailfish OS is a Linux-based operating system based on open source projects such as Mer and including a closed source UI.<br />
<br />
* ''' Download location '''<br />
The SailfishOS image is built on Gitlab CI. The latest image can be installed using the [https://raw.githubusercontent.com/sailfish-on-dontbeevil/flash-it/master/flash-it.sh flashing script].<br />
<br />
The script downloads the image and bootloader from our CI, extracts everything and burns it onto the SD card. '''Note:''' The script will format and erase the SD card!<br />
<br />
''Instructions:''<br />
# Download the flashing script<br />
# Insert a microSD card in your device<br />
# Make the script executable: <code>chmod +x flash-it.sh</code><br />
# Verify that you have the <code>bsdtar</code> package installed<br />
# Execute it: <code>./flash-it.sh</code><br />
# Follow the instructions. Some commands in the script require root permissions (for example: mounting and flashing the SD card).<br />
<br />
Note that after writing the µSD card and booting the phone, as per [https://www.reddit.com/r/pinephone/comments/f1l7bm/sailfish_os_on_pinephone_best_os_so_far_in_my/fh8o0s2/ this Reddit comment] you may have to adjust the auto-brightness settings in order to actually see the interface.<br />
<br />
* ''' user-id/password '''<br />
<br />
* ''' What works, what does not work '''<br />
The current (9 Apr) version of Sailfish no longer has a defect with the auto brightness! To install on an SD card you need to have a SIM card in the phone - it doesn't have to be an active one. If you then unset the device lock password in Settings, you can take the SIM card out. If you're not familiar with SFOS, be prepared by having a (free) Jolla account and pay attention to the tutorial - the interface works great but isn't immediately obvious. If you are familiar with SFOS, you can skip the tutorial by touching all 4 corners starting top left. There is a poor selection of apps available from the Jolla store. To enable openrepos.net, install the Storeman app by downloading the rpm from https://openrepos.net/content/osetr/storeman. You may need to get the rpm onto the phone by other means, as the built-in browser is not working at the moment. Press the rpm file and you will be asked to install it. Once you have the Storeman app installed, browse the store and add the repository first from the pulley menu before installing the app you want. The Webcat browser on openstore works.<br />
<br />
* ''' Where/how to report defects '''<br />
<br />
See [https://sailfishos.org/wiki/Collaborative_Development#Reporting_issues the Sailfish wiki] for links to their forum as well as info required when reporting an issue.<br />
<br />
* ''' Contributions '''<br />
[https://sailfishos.org/wiki/SailfishOS See the SailfishOS wiki for options to contribute.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
<br />
== PureOS ==<br />
PureOS is a GNU/Linux distribution focusing on privacy and security, using the GNOME desktop environment. It is developed and maintained by Purism.<br />
<br />
* ''' Download location '''<br />
This is an unofficial/unsupported creation by ''mozzwald'' that can be downloaded [http://pureos.ironrobin.net/droppy/#/Images here.] <br />
<br />
* ''' user-id/password '''<br />
purism/123456<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Fedora ==<br />
http://files.pine64.org/sw/pine64_installer/json/fedora.png<br />
<br />
An (unofficial) vanilla Fedora rawhide build for aarch64 with megi's kernel and [https://copr.fedorainfracloud.org/coprs/njha/mobile/packages/ some additional packages] to tie it all together. It aims to eventually be an upstream part of the Fedora project, rather than a phone-specific distribution.<br />
<br />
See [https://forum.pine64.org/showthread.php?tid=9347 this thread in the forum.]<br />
<br />
* ''' Download location '''<br />
[https://github.com/nikhiljha/pp-fedora-sdsetup/releases/ flashable images] or [https://github.com/nikhiljha/pp-fedora-sdsetup/tree/image build scripts]<br />
<br />
The images are compressed with zstd because the maintainer needs an excuse to use zstd.<br />
<br />
* ''' user-id/password '''<br />
pine/1111<br />
<br />
* ''' What works, what does not work '''<br />
* WiFi, Bluetooth, SMS, Data, Calls all work!<br />
* There are still a few bugs though, and [https://xnux.eu/devices/pine64-pinephone.html#toc-feature-driver-support-matrix some features don't have driver support yet] on any PinePhone distribution. <br />
<br />
* ''' Where/how to report defects '''<br />
Please send your bug reports to [https://github.com/nikhiljha/pp-fedora-sdsetup/issues the project's issue tracker.] Be sure to include logs if applicable!<br />
<br />
* ''' Contributions '''<br />
Please help! Send us merge requests on [https://github.com/nikhiljha/pp-fedora-sdsetup/ Github.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Arch Linux ARM ==<br />
http://files.pine64.org/sw/pine64_installer/json/archlinux.png<br />
<br />
An (unofficial) barebone Arch Linux ARM image, all you have is just a shell and SSH.<br />
<br />
* ''' Download location '''<br />
https://github.com/dreemurrs-embedded/Pine64-Arch/releases<br />
<br />
* ''' user-id/password '''<br />
alarm/alarm<br />
<br />
* ''' What works, what does not work '''<br />
It is fast and smooth, but minimal: nothing is preinstalled, so you will have to install a desktop environment on your own. GNOME is a good example to look at.<br />
<br />
To access the device, ssh to 172.16.42.1 with the credentials above. <br />
<br />
* ''' Contributions '''<br />
Feel free to send us merge requests on [https://github.com/dreemurrs-embedded/Pine64-Arch/pulls GitHub.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Manjaro ARM ==<br />
http://files.pine64.org/sw/pine64_installer/json/manjaro.png<br />
Manjaro is a user-friendly Linux distribution based on the independently developed Arch operating system with the Plasma Mobile desktop environment.<br />
<br />
* ''' Download location '''<br />
[https://forum.manjaro.org/t/manjaro-arm-alpha4-for-pinephone-and-pinetab/127684 See Manjaro forum announcement of Alpha4 version]<br />
<br />
* ''' user-id/password '''<br />
** manjaro/1234<br />
** root/root<br />
<br />
* ''' What works, what does not work '''<br />
[https://forum.manjaro.org/t/manjaro-arm-alpha4-for-pinephone-and-pinetab/127684 See Manjaro announcement.]<br />
<br />
In particular phone calls do NOT yet work from the Phone application.<br />
<br />
* ''' Where/how to report defects '''<br />
<br />
* ''' Contributions '''<br />
[https://forum.manjaro.org/t/manjaro-arm-alpha4-for-pinephone-and-pinetab/127684 See the end of the announcement here.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Maemo Leste ==<br />
http://files.pine64.org/sw/pine64_installer/json/maemo_leste.png<br />
<br />
[https://en.wikipedia.org/wiki/Maemo Maemo] is a trimmed-down version of Debian for mobile devices, originally a collaboration between Nokia and many open source projects (the [http://maemo.org/intro/ Maemo community]) before Nokia abandoned it. The community now takes full responsibility in developing fully open source Maemo for a variety of mobile devices. <br />
<br />
The new version Maemo 7 "Leste" is an ARM64 port of [https://devuan.org/ Devuan] (Debian without systemd) and runs the mainline Linux kernel. The default user interface stack is [https://en.wikipedia.org/wiki/Hildon Hildon], [https://en.wikipedia.org/wiki/X.Org_Server Xorg], [https://en.wikipedia.org/wiki/Matchbox_(window_manager) Matchbox WM], and [https://en.wikipedia.org/wiki/GTK GTK]. The current version is Devuan Ascii (Debian Stretch) and they are working on an upgrade to Devuan Beowulf (Debian Buster) as well as simultaneous support for both Devuan and Debian. In addition to the main repository, they [https://maemo-leste.github.io/maemo-leste-repositories-and-community-packages.html announced] a [https://github.com/maemo-leste-extras/bugtracker community repository]. To keep updated they use automation in their package maintenance with [https://github.com/maemo-leste/jenkins-integration jenkins] (similar to [https://www.debian.org/devel/buildd/ debian's buildd]). Porting packages to Maemo Leste is basically a simple matter of porting to arm64 version of Debian/Devuan, which benefits both projects.<br />
<br />
More detailed information can be found on [https://leste.maemo.org/Main_Page the Maemo Leste wiki], or follow [https://maemo-leste.github.io/ announcements on their website], and check out [https://leste.maemo.org/Leste_FAQ Frequently Asked Questions]. <br />
<br />
* ''' Download location '''<br />
[http://maedevu.maemo.org/images/pinephone/ Maemo Leste test builds.] There is also an [https://github.com/maemo-leste/image-builder image builder], see their wiki for instructions on how to [https://leste.maemo.org/Image_Builder build a custom image].<br />
<br />
* ''' user-id/password '''<br />
root/toor<br />
<br />
You may use "sudo" directly.<br />
<br />
* ''' What works, what does not work '''<br />
For current status and work arounds please read their [https://leste.maemo.org/PinePhone PinePhone wiki page], and update as necessary (make sure to notify them of new issues by leaving a report on their github, see below).<br />
<br />
* ''' Where to Report Issues '''<br />
Most discussion occurs at #maemo-leste on freenode IRC. The Maemo website also has an [https://talk.maemo.org/showthread.php?p=1565822 ongoing forum thread] for feedback about Maemo Leste on the PinePhone BraveHeart edition.<br />
<br />
All other contact information is listed on the [https://leste.maemo.org/Main_Page main page] of the Maemo wiki. You should [https://github.com/maemo-leste/bugtracker/issues submit bug reports] on github. To track known issues, you may use these search terms: [https://github.com/maemo-leste/bugtracker/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+pinephone pinephone], [https://github.com/maemo-leste/bugtracker/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+pine64 pine64]<br />
<br />
* ''' Development '''<br />
Learn about [https://leste.maemo.org/Development development], [https://leste.maemo.org/Development/Porting_Packages porting packages], [https://leste.maemo.org/Development/Building_Packages building packages], [https://leste.maemo.org/Development/Tasks todo list], and general info on [https://wiki.debian.org/HowToPackageForDebian how to package for Debian]. Some tasks have funding available. <br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Nemo Mobile ==<br />
http://files.pine64.org/sw/pine64_installer/json/nemo_mobile.png<br />
Nemo Mobile is the open source build of Sailfish OS.<br />
<br />
See [https://forum.pine64.org/showthread.php?tid=9043 this forum thread] for how to get going.<br />
<br />
* ''' Download location '''<br />
[https://github.com/neochapay/nemo-device-dont_be_evil/ Download location is here on GitHub.]<br />
<br />
* ''' user-id/password '''<br />
<br />
* ''' What works, what does not work '''<br />
[https://github.com/neochapay/nemo-device-dont_be_evil/ Scroll down the page here.]<br />
<br />
* ''' Where/how to report defects '''<br />
For more info please visit [https://github.com/neochapay/nemo-device-dont_be_evil neochapay's github page]<br />
<br />
* ''' Contributions '''<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== KDE Neon ==<br />
http://files.pine64.org/sw/pine64_installer/json/plasma_mobile.png<br />
Based on KDE Neon for the desktop, comes with Plasma Mobile.<br />
<br />
* ''' Download location '''<br />
[https://images.plasma-mobile.org/pinephone/ Plasma mobile images can be found here.]<br />
<br />
* ''' user-id/password '''<br />
phablet/1234<br />
<br />
* ''' What works, what does not work '''<br />
<br />
* ''' Where/how to report defects '''<br />
<br />
* ''' Contributions '''<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== NixOS ==<br />
http://files.pine64.org/sw/pine64_installer/json/nixos.png<br />
<br />
''NixOS support is handled through the Mobile NixOS project.''<br />
<br />
* [https://mobile.nixos.org/ Project home page]<br />
* [https://github.com/NixOS/mobile-nixos Source code repository]<br />
<br />
There is no pre-built complete image. For now users are expected to follow the instructions in the [https://mobile.nixos.org/getting-started.html Getting Started page], and on [https://mobile.nixos.org/devices/pine64-pinephone-braveheart.html the device's page].<br />
<br />
* ''' What works, what does not work '''<br />
<br />
This information may change, but currently it boots and is about as compatible with Mobile NixOS as the Android-based devices are. It even supports a bit more, since it can use Wi-Fi.<br />
<br />
<cite><br />
Support for all of the hardware will be coming, this project is a breadth-first work, where the work spans multiple devices in parallel.<br />
</cite><br />
<br />
* ''' Where/how to report defects '''<br />
On [https://github.com/NixOS/mobile-nixos/issues the project's repository]. Please specify that you are using a Pinephone when reporting issues.<br />
<br />
* ''' Contributions '''<br />
[https://nixos.org/nixos/community.html Details about contributions and donations are on the NixOS website.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== LuneOS ==<br />
http://files.pine64.org/sw/pine64_installer/json/luneos.jpg<br />
Based on WebOS by LG, comes with Luna Next desktop environment.<br />
<br />
* ''' Download location '''<br />
[http://build.webos-ports.org/luneos-testing/images/pinephone/ LuneOS test image for PinePhone]<br />
Tofe recommends using bmaptool; for example: <code>bmaptool copy http://build.webos-ports.org/luneos-testing/images/pinephone/luneos-dev-image-pinephone-testing-0-15.rootfs.wic.gz /dev/mmcblk0</code>. Rename the .wic file to .img for standard dd usage. <br />
<br />
* ''' user-id/password '''<br />
<br />
* ''' What works, what does not work '''<br />
<br />
* ''' Where/how to report defects '''<br />
<br />
* ''' Contributions '''<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Aurora ==<br />
Available soon? https://mobile.twitter.com/neochapay/status/1189552654898188288?p=p<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Android 10 ==<br />
https://www.pine64.org/wp-content/uploads/2020/03/androidpp.jpg<br />
<br />
There is no download link yet, but the image above is from the March community update from Pine64, showing an Android 10 ROM running on the PinePhone by [https://github.com/Icenowy Moe Icenowy]. This image is absolutely bare-bones (no applications yet), and comments on IRC indicated it was a theoretical test only, rather than a step towards a release.<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
= Mobile Carrier APN Settings =<br />
<br />
See thread: https://forum.pine64.org/showthread.php?tid=9150<br />
<br />
Disclaimer: Check the website or customer support line of the carrier you want to use; no guarantees. This list is user-generated, simply demonstrating settings that worked for individual users, and it is not exhaustive - it does not cover all possible carriers.<br />
<br />
''' Distributions with Phosh (Debian + Phosh, pmOS + Phosh, Fedora) '''<br />
<br />
APN settings are either located in <code>Settings > Mobile > Access Point Names</code> (pureOS, Debian + Phosh) or <code>Settings > Network > Network Dropdown > Add new connection</code> (pmOS, Fedora).<br />
<br />
== ATT-based (USA) ==<br />
<br />
* ATT<br />
* Metro<br />
<br />
== Red Pocket (USA) ==<br />
<br />
You can choose AT&T, Verizon, T-Mobile or Sprint network. Known to work with the GSMA (ATT) SIM, calls and SMS work.<br />
<br />
APN settings:<br />
<br />
name: Red Pocket<br />
APN: RESELLER<br />
<br />
== Mint Mobile (USA) ==<br />
<br />
Source: https://www.mintmobile.com/setup-for-android/<br />
<br />
Use the following APN settings:<br />
<br />
Name: Ultra<br />
APN: Wholesale<br />
<br />
Call their customer service to activate using the number on their website, or activate on their website: https://my.mintmobile.com/activation. You may also need to reboot your phone.<br />
<br />
== Tracfone (USA?) == <br />
<br />
'''BYOP SIM Card Kit''': works with T-Mobile and AT&T compatible SIM cards provided in BYOP kit<br />
<br />
Calls, SMS, and 3G/4G data are known to work with the AT&T SIM (most likely T-Mobile as well).<br />
<br />
Use the following APN settings:<br />
<br />
Name: Tracfone<br />
APN: RESELLER<br />
<br />
== Mobile Vikings (Belgium) ==<br />
<br />
Source: https://support.vikingco.com/hc/en-us/articles/202836041-I-don-t-have-any-mobile-internet-What-do-I-do-<br />
<br />
Name: Mobile Vikings<br />
APN: web.be<br />
Username: web<br />
Password: web<br />
<br />
== Virgin Mobile (Canada) ==<br />
<br />
These settings might work with Bell Canada too, since it is the same network. Calls, SMS and 4G data appear to be working fine. Note that the SIM may well be a nano-SIM while the PinePhone takes a micro-SIM, which would require an adapter.<br />
<br />
Name: Mobile Fast Web<br />
APN: pda2.bell.ca<br />
Username:<br />
Password:<br />
<br />
== Carriers That Do Not Work ==<br />
<br />
* FreedomPop (USA): VoIP service. Customer service said they require Android 4.3+, and their free calling and texting works only through the Google Play app they make you use. So calls and texts don't work with non-smart phones and won't work with the PinePhone (even though it is a smart phone) because of software incompatibility. However, data still works if the APN is correctly set to fp.com.attz. You get 200MB of free data per month, but watch out: you will be hit with a $20 top-up charge if you exceed the 200MB limit. <br />
* VoLTE services like Sprint or Verizon<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
= Other Resources =<br />
Community<br />
* [https://forum.pine64.org/forumdisplay.php?fid=120 PinePhone Forum]<br />
* [http://www.pine64.xyz:9090/?channels=PINEPHONE PinePhone IRC Channel]<br />
<br />
Hardware information<br />
* [[PinePhone]] hardware details in this Pine64 wiki.<br />
* [[PinePhone_v1.1_-_Braveheart]] hardware details specific to the Braveheart handsets.<br />
* The postmarketOS wiki has a detailed page on the PinePhone hardware [https://wiki.postmarketos.org/wiki/PINE64_PinePhone_(pine64-pinephone) here,] and the preceding devkit [https://wiki.postmarketos.org/wiki/PINE64_Don%27t_be_evil_devkit_(pine64-dontbeevil) here.]<br />
<br />
Other software information<br />
* [https://linux-sunxi.org/Main_Page sunxi community wiki]<br />
* [https://xnux.eu/devices/pine64-pinephone.html megi feature/driver support matrix]<br />
* [https://megous.com/dl/tmp/README.bootui megi bootUI notes (for dualbooting/multibooting)]<br />
* [https://github.com/ayufan-pine64/boot-tools ayufan boot tools]<br />
<br />
Other<br />
* [https://store.pine64.org/?post_type=product Pine64 shop]<br />
* [https://www.pine64.org/2020/01/24/setting-the-record-straight-pinephone-misconceptions/ Pine64 blog on blobs]<br />
* [https://tuxphones.com/yet-another-librem-5-and-pinephone-linux-smartphone-comparison/ Martijn Braam Librem 5 comparison, especially covering openness/blobs]<br />
* [https://fam-ribbers.com/2019/12/28/State-of-Linux-on-mobile-and-common-misconceptions.html Bart Ribbers blog on linux distributions and desktop environments on mobile devices.]<br />
* [https://www.jeffgeerling.com/blog/2019/a2-class-microsd-cards-offer-no-better-performance-raspberry-pi Jeff Geerling on testing micro SD cards.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div></div>Newton688https://wiki.pine64.org/index.php?title=PinePhone_Software_Releases&diff=5462PinePhone Software Releases2020-04-14T19:03:31Z<p>Newton688: /* Mobile Carrier APN Settings */</p>
<hr />
<div><br />
This page is intended to help you install a software release on your [[PinePhone]]. It also provides details about all available releases as well as links to further resources.<br />
<br />
= General instructions =<br />
<br />
Releases are first installed to a Micro SD card. Choose a card with fast I/O on small files for the best performance. See [[#Other Resources]] for performance tests of various SD cards.<br />
<br />
This section has generic installation instructions. Please see the [[#Software Releases]] section for specific installation instructions for each distribution.<br />
<br />
== Boot priority ==<br />
<br />
The default PinePhone boot priority is first the SD card and then the eMMC, so inserting your own SD card with your preferred release will result in the phone booting your image. <br />
<br />
User ''megi'' has demonstrated his multi-boot development [https://www.youtube.com/watch?v=ZL1GREqoqx8 on YouTube]; see Other Resources at the bottom of this page for a link to his notes.<br />
<br />
== Preparation of SD card ==<br />
# Download your chosen image from the options below<br />
# Extract the compressed file<br />
# Write the image to your SD card<br />
# Plug SD card into phone<br />
# Boot phone<br />
If you need step-by-step instructions for writing an image to an SD card, check [[NOOB#Step-by-Step_Instructions_to_Flashing_MicroSD_Cards]] then return to this page.<br />
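The steps above can be sketched in the shell. This is a minimal dry run: the image name is made up, and a plain file stands in for the SD card device so the commands are safe to execute; for a real card you would point <code>dd</code> at the device itself (e.g. /dev/sdX, verified with lsblk first).<br />

```shell
# Dry run of steps 2-3 (extract, then write). IMG is a hypothetical image
# name; DEV is a plain file standing in for the SD card device so this can
# be tried safely. For a real card, set DEV to the device (verify with lsblk!).
set -e
IMG=example-release.img
DEV=sdcard-standin.img

dd if=/dev/zero of="$IMG" bs=1M count=4 2>/dev/null   # fake "downloaded" image
gzip -f "$IMG"                                        # releases ship compressed

gzip -d "$IMG.gz"                                     # step 2: extract
dd if="$IMG" of="$DEV" bs=1M conv=fsync 2>/dev/null   # step 3: write
cmp "$IMG" "$DEV" && echo "image written and verified"
```

<code>conv=fsync</code> makes dd flush writes before exiting, which matters on real removable media.<br />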
<br />
== Installation to eMMC (Optional) ==<br />
<br />
=== Method Using Factory Installed Tools ===<br />
<br />
The initial OS you get with your phone has the option to flash an image on SD card to eMMC. <br />
<br />
# Copy (not flash) the image file to a formatted SD card.<br />
# Insert SD card into powered-off phone.<br />
# Turn on phone and select option to install to eMMC.<br />
<br />
=== Safe & Easy Method ===<br />
<br />
[https://forum.pine64.org/showthread.php?tid=9444 Jumpdrive thread]<br />
<br />
# download and extract [https://github.com/dreemurrs-embedded/Jumpdrive/releases the Jumpdrive image]<br />
# flash the Jumpdrive image to a SD card<br />
# boot from the SD card<br />
# connect the PinePhone to your computer using USB-A -> USB-C cable.<br />
# flash the exposed (mounted) PinePhone drive with a chosen OS image as you'd flash any SD card, and resize partitions (optional, see below)<br />
# disconnect the PinePhone from your PC, power it down and remove the Jumpdrive SD card<br />
# boot into your OS of choice on eMMC<br />
<br />
The Jumpdrive image is smaller than 50MB. You can keep an SD card specifically for using Jumpdrive, and there are 64MB micro SD cards sold cheaply that will suffice.<br />
<br />
=== Safe With No Extra Tools, But Slower ===<br />
<br />
# Prepare a formatted SD card, flash desired OS to the SD card, and (optionally) resize the partition (see below)<br />
# Insert SD card and boot the phone<br />
# Open terminal and <code>git clone [url]</code> your desired project OR: Open web browser and download the desired OS image file. <br />
# Build the OS (Optional)<br />
# Flash the resulting image file (downloaded or built) to eMMC using <code>dd if=/dev/mmcblkX of=/dev/mmcblkY bs=1M</code>, where X is the number label of the SD card and Y is the number label of the eMMC. Use the command ''lsblk'' to check your devices: typically with the current kernel the SD card is /dev/mmcblk0 and the eMMC is /dev/mmcblk2, but as always with ''dd'', be extremely cautious to get the devices correct. Then resize the partition to fill up the entire disk (see below).<br />
# Turn off phone, remove SD card. Turn on phone.<br />
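As a sanity check before the <code>dd</code> in step 5, list the block devices first; sizes are the easiest way to tell the SD card and the eMMC apart.<br />

```shell
# List whole devices (not partitions) with their sizes. On the PinePhone the
# SD card is typically /dev/mmcblk0 and the eMMC /dev/mmcblk2, but always
# confirm by size (the stock eMMC is 16GB) before flashing anything.
lsblk -d -o NAME,SIZE,TYPE
```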
<br />
=== Risky Method ===<br />
<br />
Warning: This copies a mounted filesystem, which can lead to instability, erratic behavior, and data corruption. Do not use long term.<br />
<br />
# Prepare a new SD card, flash desired OS to the SD card<br />
# Boot the phone with your new SD card image<br />
# Within the booted OS, flash/clone the running OS to eMMC, e.g. using dd. It will take about 15 minutes (depending on the speed of your card), and in the end it may show an error about not enough space - just ignore it.<br />
# Turn off phone, take out SD card, and try booting the phone which should load up the new OS from eMMC.<br />
# Open terminal and resize partition to fill up entire disk (see below).<br />
<br />
== Resize partition to fit disk space ==<br />
<br />
Once you've flashed the OS to your SD card or eMMC storage, you may also need to expand the partition to fill all the available space.<br />
<br />
=== Resize SD card's partition using computer ===<br />
<br />
For SD cards, insert the SD card and resize the partitions through the computer. For eMMC, insert the phone cable and use Jumpdrive to access the eMMC directly, and resize the partition after flashing the image.<br />
<br />
Using Growpart: <br />
<br />
growpart /dev/sdX 1<br />
resize2fs /dev/sdX1<br />
<br />
Locate growpart (<code>apt-cache search growpart</code> and install the package in the search results) and run: <br />
growpart /dev/mmcblkX Y<br />
resize2fs /dev/mmcblkXpY<br />
where X is the storage device and Y is the partition number (viewable from lsblk).<br />
<br />
If you get any errors about missing or unknown commands, use apt-cache search to find and install the needed software. Also don't forget to use sudo.<br />
<br />
Using Parted: <br />
<br />
Parted's interactive mode and resize work well together. Do this before you put your SD card into the PinePhone for the first time for best results.<br />
<br />
sudo parted /dev/<your_sd_card_device><br />
(parted) resizepart 2 100%<br />
(parted) quit<br />
sudo resize2fs /dev/<the_second_sd_card_PARTITION><br />
<br />
=== Resize from within PinePhone: ===<br />
<br />
eMMC: you would need to resize the partition on eMMC (flashed with the operating system) by booting another image from the SD card: that way, the eMMC will be unmounted. It is '''not recommended''' to resize eMMC while booted from eMMC! Resizing a currently mounted partition can have weird results. <br />
<br />
SD card: It is generally not possible to boot from eMMC to partition the unmounted SD card, because of the boot order -- you would have to write the image to the empty SD card first, then resize partition, all without rebooting. It is also '''not recommended''' to resize the SD card while booted from SD card! Resizing a currently mounted partition can have weird results.<br />
<br />
== Installing Any ARM64 Distribution ==<br />
'''Warning:''' Distributions not on this page may not even boot after you follow this section. In the best case, they will be barely usable. This is more for fun, or if you would like to port a new distro to the PinePhone.<br />
<br />
'''Note:''' This section uses megi's kernel releases, and not the official ones from PINE64. While it is possible to use the official (and in the future, mainline) kernel, megi provides binary releases, which makes it very easy.<br />
<br />
If you would like to see specific commands for how to complete these steps, see https://github.com/nikhiljha/pp-fedora-sdsetup (an example for Fedora) or https://xnux.eu/howtos/install-arch-linux-arm.html (an example for Arch Linux).<br />
<br />
# Create a boot (from 4MiB to about 252MiB) and root (from 252 MiB to the end of the card) filesystem on the SD card.<br />
# Format the boot partition with vfat, and the root partition with f2fs.<br />
# Extract the root filesystem from your distro's ARM image into the root filesystem on the SD card. Do not copy the partition, copy the files (in archive mode, e.g. <code>rsync -a</code>).<br />
# Edit /etc/fstab to match your partitions.<br />
# Grab megi's kernel from https://xff.cz/kernels/ (you probably want 5.6).<br />
# Follow the README instructions, which involves copying the kernel modules into the sd card rootfs and writing u-boot and the bootloader.<br />
<br />
== Backlight ==<br />
None of the current distributions handle the backlight well at low brightness.<br />
If configured too low, the backlight shuts down completely, but the panel still shows an image that remains usable under bright ambient light.<br />
<br />
Sailfish is one OS that initially uses automatic backlight control, and the default setting can make the screen appear blank. By shining a bright light on the screen you can still navigate (the light sensor may even switch the backlight on temporarily), which makes it possible to disable auto brightness under Settings, Display.<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
= Software Releases =<br />
<br />
== postmarketOS ==<br />
http://files.pine64.org/sw/pine64_installer/json/postmarketos.png<br />
<br />
postmarketOS is a preconfigured version of [https://www.alpinelinux.org/ Alpine Linux] for mobile devices that offers a choice of several desktop environments including Plasma Mobile and phosh. <br />
<br />
* ''' Download location '''<br />
Rather than downloading a demo image [https://wiki.postmarketos.org/wiki/Installation_guide postmarketOS recommend] the use of their script, pmbootstrap, that can tailor build your SD card for you. See for example [https://forum.pine64.org/showthread.php?tid=8285 this forum thread.] <br />
<br />
Note pmbootstrap offers an option [https://wiki.postmarketos.org/wiki/PINE64_PinePhone_(pine64-pinephone)#Installation to install to the eMMC.]<br />
<br />
[http://images.postmarketos.org/pinephone/ Demo images can be found here.]<br />
<br />
* ''' user-id/password '''<br />
demo/147147 (for demo images only - when building an image with pmbootstrap you set your own user-id and password. '''NOTE: currently some lock screens require your password but only present a numeric keyboard, so you should use ''only'' numbers in your password until you've verified you can unlock with other characters.''')<br />
<br />
* ''' What works, what does not work '''<br />
See [https://wiki.postmarketos.org/wiki/PINE64_PinePhone_(pine64-pinephone) postmarketOS dedicated PinePhone wiki page]<br />
<br />
If you install the Firefox browser (or are using a demo image that already has it installed), [https://wiki.postmarketos.org/wiki/Firefox these hints in the pmOS wiki are recommended:] GDK_SCALE=1 is best for the PinePhone screen, and enabling Wayland gets applications to fit the screen and allows keyboard entry.<br />
<br />
* ''' Where/how to report defects '''<br />
[https://gitlab.com/postmarketOS/postmarketos/issues/3 postmarketOS issue tracker for PinePhone support]<br />
<br />
* ''' Contributions '''<br />
[https://wiki.postmarketos.org/wiki/Contributing See postmarketOS wiki for options to contribute.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Ubuntu Touch by UBPorts ==<br />
http://files.pine64.org/sw/pine64_installer/json/ubports.png<br />
A Mobile Version of the Ubuntu Operating System made and maintained by the UBports Community.<br />
<br />
A short, state-of-the-art (as at 2 April 2020) [https://youtu.be/3Ne6G0-hn9g demo on YouTube.]<br />
<br />
[https://ubuntu-touch.io/ Ubuntu Touch] is a mobile version of Ubuntu developed by the UBports community. Images can be downloaded from [https://ci.ubports.com/job/rootfs/job/rootfs-pinephone/ here]. There is also a [https://github.com/goddard/pinephone/ script] to download the latest image and flash it to your PinePhone. In the future, Ubuntu Touch will be able to be installed onto the PinePhone with the [https://ubuntu-touch.io/get-ut UBports installer] GUI tool.<br />
<br />
* ''' Download location '''<br />
[https://gitlab.com/ubports/community-ports/pinephone See UBports gitlab page.]<br />
<br />
* ''' user-id/password '''<br />
The default password is <code>phablet</code><br />
<br />
* ''' What works, what does not work '''<br />
[https://gitlab.com/ubports/community-ports/pinephone Scroll down to the bottom of this page.]<br />
<br />
* ''' Where/how to report defects '''<br />
[https://gitlab.com/ubports/community-ports/pinephone See UBports gitlab page.]<br />
<br />
* ''' Contributions '''<br />
[https://ubports.com/foundation/sponsors See UBports website for how to donate.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Debian-PinePhone ==<br />
[[File:Debian-logo.png]]<br />
<br />
See [https://forum.pine64.org/showthread.php?tid=9016 this thread in the forum.]<br />
<br />
An unofficial Debian build for ARM64 running with the [https://developer.puri.sm/Librem5/Software_Reference/Environments/Phosh.html phosh user interface] (developed by Purism, phosh uses [https://en.wikipedia.org/wiki/Wayland_(display_server_protocol) Wayland] instead of [https://en.wikipedia.org/wiki/X.Org_Server Xorg]). The base system is pure Debian, with only the GUI applications and a few others (ModemManager, Wifi chip firmware) being built from modified sources (as well as the kernel and u-boot, of course). Current version is Debian Bullseye. <br />
<br />
* ''' Download location '''<br />
[http://pinephone.a-wai.com/images/ Debian-pinephone downloadable images]<br />
<br />
Don't forget to extract the image before installing: <br />
$ gzip -d Downloads/debian-pinephone-*.img.gz<br />
<br />
See the [https://gitlab.com/a-wai/debos-pinephone project page] for specific installation instructions.<br />
<br />
* ''' user-id/password '''<br />
debian/1234<br />
<br />
* ''' What works, what does not work '''<br />
<br />
See [https://gitlab.com/a-wai/debos-pinephone/-/blob/master/README.md the project's README file] for most up to date status.<br />
<br />
Check [https://gitlab.com/a-wai/debos-pinephone/-/issues bug tracker] for known issues. Questions? Ask on our [https://forum.pine64.org/showthread.php?tid=9016 thread].<br />
<br />
* ''' Where/how to report defects '''<br />
It is recommended that you log your bug reports in [https://gitlab.com/a-wai/debos-pinephone/issues/ the project's issue tracker.] As a general rule, issues with third-party apps (even the default ones) should be reported upstream. A Debian-PinePhone issue would be related to getting the hardware to work on the PinePhone, but if unsure where the issue should be reported just open a ticket or ask.<br />
<br />
* ''' Contributions '''<br />
Feel free to pick an open issue to work on, or send a merge request on [https://gitlab.com/a-wai/debos-pinephone/ Gitlab.]<br />
<br />
* ''' User Experience Notes '''<br />
<br />
If not already mentioned on the project page, the [https://forum.pine64.org/showthread.php?tid=9016 thread] might have known workarounds to software and user experience issues as contributed by the users.<br />
<br />
'''The Chatty app''' requires entering +[country code]-[phone number] to start a new text. Without the + and the country code (+1 for USA) you won't be able to send a new text. <br />
<br />
'''To adjust screen resolution''' [https://puri.sm/posts/easy-librem-5-app-development-scale-the-screen/] [https://forum.pine64.org/showthread.php?tid=9016&pid=61403#pid61403] [https://forum.pine64.org/showthread.php?tid=9016&pid=61685#pid61685] <br />
<br />
# <code>sudo apt install linux-libc-dev build-essential ninja-build meson cmake libwayland-dev</code><br />
# Continue the rest of the instructions on [https://puri.sm/posts/easy-librem-5-app-development-scale-the-screen/ this page]<br />
# When you finish, you will have a touch-capable app you can use to adjust resolution any time, useful when switching between various apps. Unlike other solutions, this works across reboots.<br />
<br />
'''Most of Debian's repository is available.''' There are packages that apt won't find, which need to be cross compiled ("ported") to ARM64 (see [https://wiki.debian.org/Arm64Port Debian's wiki on ARM64 port]), but the process is fairly easy. Most developers package their software for the AMD64 version of Debian, so they will throw an error when run; if you have the source code, you can compile it to run on ARM64/PinePhone. If you do so, you should contact the developers so they can provide precompiled ARM64 packages for others in the future. You should also contact Debian if you have working ARM64 packages not listed on [https://wiki.debian.org/Arm64Port this page], since this helps them track the status of ARM64 with Debian. Give their wiki page some TLC.<br />
<br />
'''Apps that don't work with Wayland''': if you encounter an app that only works with X11 and not Wayland, report it upstream to the app's developers.<br />
<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== SailfishOS ==<br />
http://files.pine64.org/sw/pine64_installer/json/sailfishos.png<br />
Sailfish OS is a Linux-based operating system based on open source projects such as Mer and including a closed source UI.<br />
<br />
* ''' Download location '''<br />
The SailfishOS image is built on Gitlab CI. The latest image can be installed using the [https://raw.githubusercontent.com/sailfish-on-dontbeevil/flash-it/master/flash-it.sh flashing script].<br />
<br />
The script downloads the image and bootloader from our CI, extracts everything and burns it onto the SD card. '''Note:''' The script will format and erase the SD card!<br />
<br />
''Instructions:''<br />
# Download the flashing script<br />
# Insert a microSD card in your device<br />
# Make the script executable: <code>chmod +x flash-it.sh</code><br />
# Verify that you have the <code>bsdtar</code> package installed<br />
# Execute it: <code>./flash-it.sh</code><br />
# Follow the instructions. Some commands in the script require root permissions (for example: mounting and flashing the SD card).<br />
<br />
Note that after flashing the µSD card and booting the phone, as per [https://www.reddit.com/r/pinephone/comments/f1l7bm/sailfish_os_on_pinephone_best_os_so_far_in_my/fh8o0s2/ this Reddit comment], you may have to adjust the auto-brightness settings in order to actually see the interface.<br />
<br />
* ''' user-id/password '''<br />
<br />
* ''' What works, what does not work '''<br />
The current (9 Apr) version of Sailfish no longer has the auto-brightness defect! To install to an SD card you need a SIM card in the phone - it doesn't have to be an active one. If you then unset the device lock password in Settings, you can take the SIM card out. If you're not familiar with SFOS, be prepared by having a (free) Jolla account and pay attention to the tutorial - the interface works great but isn't immediately obvious. If you are familiar with SFOS, you can skip the tutorial by touching all four corners, starting top left. There is a poor selection of apps available from the Jolla store. To enable openrepos.net, install the Storeman app by downloading the rpm from https://openrepos.net/content/osetr/storeman. You may need to get the rpm onto the phone by other means, as the built-in browser is not working at the moment. Tap the rpm file and you will be asked to install it. Once you have Storeman installed, browse the store and add the repository first from the pulley menu before installing the app you want. The Webcat browser on openrepos works.<br />
<br />
* ''' Where/how to report defects '''<br />
<br />
See [https://sailfishos.org/wiki/Collaborative_Development#Reporting_issues the Sailfish wiki] for links to their forum as well as info required when reporting an issue.<br />
<br />
* ''' Contributions '''<br />
[https://sailfishos.org/wiki/SailfishOS See the SailfishOS wiki for options to contribute.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
<br />
== PureOS ==<br />
PureOS is a GNU/Linux distribution focusing on privacy and security, using the GNOME desktop environment. It is developed and maintained by Purism.<br />
<br />
* ''' Download location '''<br />
This is an unofficial/unsupported creation by ''mozzwald'' that can be downloaded [http://pureos.ironrobin.net/droppy/#/Images here.] <br />
<br />
* ''' user-id/password '''<br />
purism/123456<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Fedora ==<br />
http://files.pine64.org/sw/pine64_installer/json/fedora.png<br />
<br />
An (unofficial) vanilla Fedora rawhide build for aarch64 with megi's kernel and [https://copr.fedorainfracloud.org/coprs/njha/mobile/packages/ some additional packages] to tie it all together. It aims to eventually be an upstream part of the Fedora project, rather than a phone-specific distribution.<br />
<br />
See [https://forum.pine64.org/showthread.php?tid=9347 this thread in the forum.]<br />
<br />
* ''' Download location '''<br />
[https://github.com/nikhiljha/pp-fedora-sdsetup/releases/ flashable images] or [https://github.com/nikhiljha/pp-fedora-sdsetup/tree/image build scripts]<br />
<br />
The images are compressed with zstd because the maintainer needs an excuse to use zstd.<br />
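Decompressing and writing a zstd-compressed image can be done in one pipeline. The sketch below is illustrative: the file names are placeholders, and a scratch file stands in for the real device node so the commands are safe to try as-is. Substitute the actual release image and your microSD card's device node (e.g. /dev/mmcblk0) for real use.<br />

```shell
# All names here are placeholders for illustration.
IMG=pinephone-fedora.raw.zst   # stands in for the downloaded release image
DEV=sdcard.img                 # e.g. /dev/mmcblk0 -- double-check before writing!

# Create a small stand-in image so this sketch is runnable end to end.
printf 'demo-image-data' > source.raw
zstd -q -f source.raw -o "$IMG"

# Decompress and write in one pipeline -- this is the whole flashing step.
zstdcat "$IMG" | dd of="$DEV" bs=4M status=none

# Verify that what was written matches the original.
cmp -s source.raw "$DEV" && echo "image written and verified"
```

With a real card, set DEV to the device node and run the dd step with sudo.<br />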
<br />
* ''' user-id/password '''<br />
pine/1111<br />
<br />
* ''' What works, what does not work '''<br />
* WiFi, Bluetooth, SMS, Data, Calls all work!<br />
* There are still a few bugs though, and [https://xnux.eu/devices/pine64-pinephone.html#toc-feature-driver-support-matrix some features don't have driver support yet] on any PinePhone distribution. <br />
<br />
* ''' Where/how to report defects '''<br />
Please send your bug reports at [https://github.com/nikhiljha/pp-fedora-sdsetup/issues the project's issue tracker.] Be sure to include logs if applicable!<br />
<br />
* ''' Contributions '''<br />
Please help! Send us merge requests on [https://github.com/nikhiljha/pp-fedora-sdsetup/ Github.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Arch Linux ARM ==<br />
http://files.pine64.org/sw/pine64_installer/json/archlinux.png<br />
<br />
An (unofficial) barebones Arch Linux ARM image; all you get is a shell and SSH.<br />
<br />
* ''' Download location '''<br />
https://github.com/dreemurrs-embedded/Pine64-Arch/releases<br />
<br />
* ''' user-id/password '''<br />
alarm/alarm<br />
<br />
* ''' What works, what does not work '''<br />
It's fast and smooth, but minimal: nothing is preinstalled, so you'll have to install a desktop environment on your own. GNOME is a good example to look at.<br />
<br />
To access the device, ssh to 172.16.42.1 with the credentials above. <br />
<br />
* ''' Contributions '''<br />
Feel free to send us merge requests on [https://github.com/dreemurrs-embedded/Pine64-Arch/pulls GitHub.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Manjaro ARM ==<br />
http://files.pine64.org/sw/pine64_installer/json/manjaro.png<br />
Manjaro is a user-friendly Linux distribution based on the independently developed Arch operating system with the Plasma Mobile desktop environment.<br />
<br />
* ''' Download location '''<br />
[https://forum.manjaro.org/t/manjaro-arm-alpha4-for-pinephone-and-pinetab/127684 See Manjaro forum announcement of Alpha4 version]<br />
<br />
* ''' user-id/password '''<br />
** manjaro/1234<br />
** root/root<br />
<br />
* ''' What works, what does not work '''<br />
[https://forum.manjaro.org/t/manjaro-arm-alpha4-for-pinephone-and-pinetab/127684 See Manjaro announcement.]<br />
<br />
In particular phone calls do NOT yet work from the Phone application.<br />
<br />
* ''' Where/how to report defects '''<br />
<br />
* ''' Contributions '''<br />
[https://forum.manjaro.org/t/manjaro-arm-alpha4-for-pinephone-and-pinetab/127684 See the end of the announcement here.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Maemo Leste ==<br />
http://files.pine64.org/sw/pine64_installer/json/maemo_leste.png<br />
<br />
[https://en.wikipedia.org/wiki/Maemo Maemo] is a trimmed-down version of Debian for mobile devices, originally a collaboration between Nokia and many open source projects (the [http://maemo.org/intro/ Maemo community]) before Nokia abandoned it. The community now takes full responsibility for developing a fully open source Maemo for a variety of mobile devices. <br />
<br />
The new version, Maemo 7 "Leste", is an ARM64 port of [https://devuan.org/ Devuan] (Debian without systemd) and runs the mainline Linux kernel. The default user interface stack is [https://en.wikipedia.org/wiki/Hildon Hildon], [https://en.wikipedia.org/wiki/X.Org_Server Xorg], [https://en.wikipedia.org/wiki/Matchbox_(window_manager) Matchbox WM], and [https://en.wikipedia.org/wiki/GTK GTK]. The current version is based on Devuan Ascii (Debian Stretch), and they are working on an upgrade to Devuan Beowulf (Debian Buster) as well as simultaneous support for both Devuan and Debian. In addition to the main repository, they have [https://maemo-leste.github.io/maemo-leste-repositories-and-community-packages.html announced] a [https://github.com/maemo-leste-extras/bugtracker community repository]. Package maintenance is automated with [https://github.com/maemo-leste/jenkins-integration jenkins] (similar to [https://www.debian.org/devel/buildd/ debian's buildd]). Porting packages to Maemo Leste is basically a matter of porting them to the arm64 version of Debian/Devuan, which benefits both projects.<br />
<br />
More detailed information can be found on [https://leste.maemo.org/Main_Page the Maemo Leste wiki], or follow [https://maemo-leste.github.io/ announcements on their website], and check out [https://leste.maemo.org/Leste_FAQ Frequently Asked Questions]. <br />
<br />
* ''' Download location '''<br />
[http://maedevu.maemo.org/images/pinephone/ Maemo Leste test builds.] There is also an [https://github.com/maemo-leste/image-builder image builder], see their wiki for instructions on how to [https://leste.maemo.org/Image_Builder build a custom image].<br />
<br />
* ''' user-id/password '''<br />
root/toor<br />
<br />
You may use "sudo" directly.<br />
<br />
* ''' What works, what does not work '''<br />
For current status and workarounds please read their [https://leste.maemo.org/PinePhone PinePhone wiki page], and update it as necessary (make sure to notify them of new issues by leaving a report on their GitHub, see below).<br />
<br />
* ''' Where to Report Issues '''<br />
Most discussion occurs at #maemo-leste on freenode IRC. The Maemo website also has an [https://talk.maemo.org/showthread.php?p=1565822 ongoing forum thread] for feedback about Maemo Leste on the PinePhone BraveHeart edition.<br />
<br />
All other contact information is listed on the [https://leste.maemo.org/Main_Page main page] of the Maemo wiki. You should [https://github.com/maemo-leste/bugtracker/issues submit bug reports] on github. To track known issues, you may use these search terms: [https://github.com/maemo-leste/bugtracker/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+pinephone pinephone], [https://github.com/maemo-leste/bugtracker/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+pine64 pine64]<br />
<br />
* ''' Development '''<br />
Learn about [https://leste.maemo.org/Development development], [https://leste.maemo.org/Development/Porting_Packages porting packages], [https://leste.maemo.org/Development/Building_Packages building packages], [https://leste.maemo.org/Development/Tasks todo list], and general info on [https://wiki.debian.org/HowToPackageForDebian how to package for Debian]. Some tasks have funding available. <br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Nemo Mobile ==<br />
http://files.pine64.org/sw/pine64_installer/json/nemo_mobile.png<br />
Nemo Mobile is the open source build of Sailfish OS.<br />
<br />
See [https://forum.pine64.org/showthread.php?tid=9043 this forum thread] for how to get going.<br />
<br />
* ''' Download location '''<br />
[https://github.com/neochapay/nemo-device-dont_be_evil/ Download location is here on GitHub.]<br />
<br />
* ''' user-id/password '''<br />
<br />
* ''' What works, what does not work '''<br />
[https://github.com/neochapay/nemo-device-dont_be_evil/ Scroll down the page here.]<br />
<br />
* ''' Where/how to report defects '''<br />
For more info please visit [https://github.com/neochapay/nemo-device-dont_be_evil neochapay's github page]<br />
<br />
* ''' Contributions '''<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== KDE Neon ==<br />
http://files.pine64.org/sw/pine64_installer/json/plasma_mobile.png<br />
Based on KDE Neon for the desktop, comes with Plasma Mobile.<br />
<br />
* ''' Download location '''<br />
[https://images.plasma-mobile.org/pinephone/ Plasma mobile images can be found here.]<br />
<br />
* ''' user-id/password '''<br />
phablet/1234<br />
<br />
* ''' What works, what does not work '''<br />
<br />
* ''' Where/how to report defects '''<br />
<br />
* ''' Contributions '''<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== NixOS ==<br />
http://files.pine64.org/sw/pine64_installer/json/nixos.png<br />
<br />
''NixOS support is handled through the Mobile NixOS project.''<br />
<br />
* [https://mobile.nixos.org/ Project home page]<br />
* [https://github.com/NixOS/mobile-nixos Source code repository]<br />
<br />
There is no pre-built complete image. For now users are expected to follow the instructions in the [https://mobile.nixos.org/getting-started.html Getting Started page], and on [https://mobile.nixos.org/devices/pine64-pinephone-braveheart.html the device's page].<br />
<br />
* ''' What works, what does not work '''<br />
<br />
This information may change, but currently it boots and is as compatible with Mobile NixOS as the Android-based devices are. It even supports a bit more, since it can use Wi-Fi.<br />
<br />
<cite><br />
Support for all of the hardware will be coming, this project is a breadth-first work, where the work spans multiple devices in parallel.<br />
</cite><br />
<br />
* ''' Where/how to report defects '''<br />
On [https://github.com/NixOS/mobile-nixos/issues the project's repository]. Please specify that you are using a Pinephone when reporting issues.<br />
<br />
* ''' Contributions '''<br />
[https://nixos.org/nixos/community.html Details about contributions and donations are on the NixOS website.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== LuneOS ==<br />
http://files.pine64.org/sw/pine64_installer/json/luneos.jpg<br />
Based on WebOS by LG, comes with Luna Next desktop environment.<br />
<br />
* ''' Download location '''<br />
[http://build.webos-ports.org/luneos-testing/images/pinephone/ LuneOS test image for PinePhone]<br />
Tofe recommends using bmaptool, for example: <code>bmaptool copy http://build.webos-ports.org/luneos-testing/images/pinephone/luneos-dev-image-pinephone-testing-0-15.rootfs.wic.gz /dev/mmcblk0</code>. Rename the .wic file to .img for standard dd usage. <br />
<br />
* ''' user-id/password '''<br />
<br />
* ''' What works, what does not work '''<br />
<br />
* ''' Where/how to report defects '''<br />
<br />
* ''' Contributions '''<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Aurora ==<br />
Available soon? https://mobile.twitter.com/neochapay/status/1189552654898188288?p=p<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
== Android 10 ==<br />
https://www.pine64.org/wp-content/uploads/2020/03/androidpp.jpg<br />
<br />
While there is no download link yet, the March community update from Pine64 includes this image of an Android 10 ROM running on the PinePhone by [https://github.com/Icenowy Moe Icenowy]. This image is absolutely bare-bones (no applications yet), and comments on IRC indicated it was a proof of concept rather than a step towards a release.<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
= Mobile Carrier APN Settings =<br />
<br />
See thread: https://forum.pine64.org/showthread.php?tid=9150<br />
<br />
Disclaimer: Go to the websites of, or speak to the customer support lines of, the carriers you want to use. No guarantees. This list is user-generated, serves only to show examples of what worked for individual users, and is not exhaustive.<br />
<br />
''' Distributions with Phosh (Debian + Phosh, pmOS + Phosh, Fedora) '''<br />
<br />
APN settings are either located in <code>Settings > Mobile > Access Point Names</code> (pureOS, Debian + Phosh) or <code>Settings > Network > Network Dropdown > Add new connection</code> (pmOS, Fedora).<br />
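On distributions that use NetworkManager for mobile data, an APN can also be defined as a keyfile instead of through the GUI. A sketch of such a keyfile, assuming NetworkManager's keyfile plugin; the file name and uuid are arbitrary placeholders (generate a real uuid with <code>uuidgen</code>), and the APN shown is the Red Pocket value from the section below:<br />

```ini
# /etc/NetworkManager/system-connections/redpocket.nmconnection
# File must be owned by root and chmod 600, or NetworkManager ignores it.
[connection]
id=Red Pocket
uuid=7e3f1f67-6b0c-4b4e-8f29-1d2a3c4b5e6f
type=gsm
autoconnect=true

[gsm]
apn=RESELLER

[ipv4]
method=auto
```

After saving the file, reload with <code>nmcli connection reload</code>.<br />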
<br />
== ATT-based (USA) ==<br />
<br />
* ATT<br />
* Metro<br />
<br />
== Red Pocket (USA) ==<br />
<br />
You can choose the AT&T, Verizon, T-Mobile or Sprint network. Known to work with the GSMA (AT&T) SIM; calls and SMS work.<br />
<br />
APN settings:<br />
<br />
name: Red Pocket<br />
APN: RESELLER<br />
<br />
== Mint Mobile (USA) ==<br />
<br />
Source: https://www.mintmobile.com/setup-for-android/<br />
<br />
Use the following APN settings:<br />
<br />
Name: Ultra<br />
APN: Wholesale<br />
<br />
Call their customer service to activate using the number on their website, or activate on their website: https://my.mintmobile.com/activation. You may also need to reboot your phone.<br />
<br />
== Tracfone (USA?) == <br />
<br />
'''BYOP SIM Card Kit''': works with T-Mobile and AT&T compatible SIM cards provided in BYOP kit<br />
<br />
Calls, SMS, and 3G/4G data known to work with AT&T SIM. (most likely works for T-Mobile as well)<br />
<br />
Use the following APN settings:<br />
<br />
Name: Tracfone<br />
APN: RESELLER<br />
<br />
== Mobile Vikings (Belgium) ==<br />
<br />
Source: https://support.vikingco.com/hc/en-us/articles/202836041-I-don-t-have-any-mobile-internet-What-do-I-do-<br />
<br />
Name: Mobile Vikings<br />
APN: web.be<br />
Username: web<br />
Password: web<br />
<br />
== Virgin Mobile (Canada) ==<br />
<br />
Settings might work with Bell Canada too, since it is the same network. Calls, SMS and 4G data appear to work fine. Note that the SIM may well be a nano-SIM while the PinePhone takes a micro-SIM, so an adapter may be required.<br />
<br />
Name: Mobile Fast Web<br />
APN: pda2.bell.ca<br />
Username:<br />
Password:<br />
<br />
== Carriers That Do Not Work ==<br />
<br />
* FreedomPop (USA): VoIP service. Customer service said they require Android 4.3+, and their free calling and texting works only through the Google Play app they make you use. So calls and texts don't work with non-smart phones and won't work with the PinePhone (even though it is a smartphone) because of software incompatibility. However, data still works if the APN has been correctly set to fp.com.attz. You get 200MB of free data per month, but watch out: you will be hit with a $20 top-up charge when you go over the 200MB limit. <br />
* VoLTE services like Sprint or Verizon<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div><br />
<br />
= Other Resources =<br />
Community<br />
* [https://forum.pine64.org/forumdisplay.php?fid=120 PinePhone Forum]<br />
* [http://www.pine64.xyz:9090/?channels=PINEPHONE PinePhone IRC Channel]<br />
<br />
Hardware information<br />
* [[PinePhone]] hardware details in this Pine64 wiki.<br />
* [[PinePhone_v1.1_-_Braveheart]] hardware details specific to the Braveheart handsets.<br />
* The postmarketOS wiki has a detailed page on the PinePhone hardware [https://wiki.postmarketos.org/wiki/PINE64_PinePhone_(pine64-pinephone) here,] and the preceding devkit [https://wiki.postmarketos.org/wiki/PINE64_Don%27t_be_evil_devkit_(pine64-dontbeevil) here.]<br />
<br />
Other software information<br />
* [https://linux-sunxi.org/Main_Page sunxi community wiki]<br />
* [https://xnux.eu/devices/pine64-pinephone.html megi feature/driver support matrix]<br />
* [https://megous.com/dl/tmp/README.bootui megi bootUI notes (for dualbooting/multibooting)]<br />
* [https://github.com/ayufan-pine64/boot-tools ayufan boot tools]<br />
<br />
Other<br />
* [https://store.pine64.org/?post_type=product Pine64 shop]<br />
* [https://www.pine64.org/2020/01/24/setting-the-record-straight-pinephone-misconceptions/ Pine64 blog on blobs]<br />
* [https://tuxphones.com/yet-another-librem-5-and-pinephone-linux-smartphone-comparison/ Martijn Braam Librem 5 comparison, especially covering openness/blobs]<br />
* [https://fam-ribbers.com/2019/12/28/State-of-Linux-on-mobile-and-common-misconceptions.html Bart Ribbers blog on linux distributions and desktop environments on mobile devices.]<br />
* [https://www.jeffgeerling.com/blog/2019/a2-class-microsd-cards-offer-no-better-performance-raspberry-pi Jeff Geerling on testing micro SD cards.]<br />
<br />
<div class="center" style="background-color: lightyellow;">[[#top | '''Return to top of page''']]</div></div>Newton688