Last week I got an idea for a screencasting app for Sailfish OS. I decided to see how easily it could be done and came up with SailCast. Getting it working took about four hours, including some testing. The app is far from perfect and doesn't perform very well, for a couple of reasons I'll explain in this post.
The app isn't very complicated. It consists of a single ScreenProvider class, which inherits QTcpServer. When the user pushes the start button in the UI, the server starts listening for incoming connections. It accepts only one connection at a time for performance reasons.
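The single-connection design can be sketched like this in plain Python (SailCast itself is a Qt/C++ app, so the function name and port below are illustrative, not from the source):

```python
import socket

def serve_one_client(port=8080):
    """Listen and hand back exactly one client connection.

    Illustrative analogue of the app's single-connection policy:
    the listen backlog is 1 and accept() is called only once, so
    at most one client is being served at any time.
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen(1)                # queue at most one pending connection
    conn, addr = srv.accept()    # blocks until a client connects
    return conn, addr
```

In the real app the equivalent happens inside Qt's event loop rather than with a blocking accept, but the effect is the same: one streaming client at a time.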
When a connection comes in, it's passed to the overridden incomingConnection method, which in turn starts another thread from which the client is served. The thread is basically a loop that consumes QByteArrays (one per image) from a QQueue. The data is written to the socket with the special HTTP multipart/x-mixed-replace Content-Type, which puts a boundary string between frames so the client can tell where one frame ends and the next begins.
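The loop and the wire format can be sketched as follows, again in Python for brevity; the boundary string and function names are assumptions for illustration, not taken from SailCast's source:

```python
import queue

BOUNDARY = b"--sailcastframe"   # hypothetical boundary string

def frame_chunk(jpeg_bytes: bytes) -> bytes:
    # In multipart/x-mixed-replace, every frame is preceded by the
    # boundary line and its own part headers.
    return (BOUNDARY + b"\r\n"
            b"Content-Type: image/jpeg\r\n"
            + b"Content-Length: %d\r\n\r\n" % len(jpeg_bytes)
            + jpeg_bytes + b"\r\n")

def serve_frames(frames, write):
    # The initial response header announces the multipart stream
    # and names the boundary (without the leading "--").
    write(b"HTTP/1.1 200 OK\r\n"
          b"Content-Type: multipart/x-mixed-replace; "
          b"boundary=sailcastframe\r\n\r\n")
    while True:
        img = frames.get()   # blocks until a new screenshot is queued
        if img is None:      # sentinel: producer has stopped
            break
        write(frame_chunk(img))
```

A browser that understands x-mixed-replace simply replaces the previous part with each new one, which is what turns a sequence of JPEGs into a video-like stream.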
The video frames are single screenshots obtained over D-Bus, driven by a timer that by default fires every 140 ms. The screenshot API is provided by Lipstick's ScreenshotService. During testing I noticed that taking a single JPEG screenshot takes about 90-100 ms, so the maximum frame rate achievable with this method is about 10 fps. On top of that, taking a screenshot blocks for some time, which means the UI becomes very laggy if the service is polled more often than every 140 ms.
The stream should work at least in Firefox and VLC. Chrome probably requires an HTML element such as <img/> wrapping the stream, because it doesn't display the stream when the plain stream URL is opened directly. Installers for both phone and tablet can be found on GitHub.
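For Chrome, a minimal wrapper page might look like the following; the address is a placeholder for the phone's actual stream URL, not something from the source:

```html
<!-- minimal wrapper page; replace phone-ip:port with the real address -->
<html>
  <body>
    <img src="http://phone-ip:port/" alt="SailCast stream"/>
  </body>
</html>
```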