In this blog post I would like to take you through an introduction to Mycroft GUI Skills and Voice Application technology on Plasma Bigscreen, and showcase some of the interesting work I have been doing for the Plasma Bigscreen project that is available in the beta image release for the Raspberry Pi 4. This beta image showcases not only media-rich voice applications but also applications specialised to fit the Bigscreen experience, all developed under an open process; more information on them follows in the sections below.
Plasma Bigscreen is a free and open-source user interface experience for big TV screens. It combines KDE Plasma technology, which powers the user interface, with Mycroft AI's voice assistant technology, packaged together on an image based on KDE Neon to provide a smart TV platform.
The experience of sitting 10 feet away from your TV just isn't complete without easy access to control it, and that is exactly the space where Mycroft AI, the open-source voice assistant, fits in to provide hands-free interaction.
Discover Mycroft AI GUI Skills
Mycroft AI skills allow the assistant to learn and perform different tasks. A weather skill, for example, allows Mycroft to look up the weather and tell you what it is going to be like on a given day; a cooking skill retrieves recipes and instructions, so you can ask Mycroft to help you make a delicious meal. There are already many voice skills available in the Mycroft skill repository for you to explore, with more being developed every day.
Mycroft AI’s graphical framework for skills is built on top of Qt and Kirigami, two mature development frameworks. This allows developers to use Python and QML to build rich voice skills with graphical user interfaces for multiple platforms. The Voice Applications featured on the Plasma Bigscreen beta image are based on this combination of voice and display technologies: we expand Mycroft Skills with a GUI so that they also work as applications that can be controlled by both voice and physical key interaction.
A Look Into Voice Applications
Voice can be a very powerful tool for interacting with applications. It can complement an application by reducing the number of actions a user needs to perform to execute a task, be it searching for a file or searching for music. Simply asking an application to play a song, for example, is a lot simpler than opening the application, hitting the search field, typing the song's name and hitting the play button.
Voice applications on the Plasma Bigscreen beta image are designed to be simple and powerful to use no matter which method you choose to interact with them. Let's look at the Youtube Voice Application, for instance.
The Youtube Voice Application is a GUI-based skill for Mycroft AI: all of its logical functions are handled and called within its skill class. Adding a desktop entry and an icon is what partially turns the GUI skill into a Voice Application. The other part is adding a landing page, or “homescreen”, and registering it in the skill class using the GUI event handler, so that it is the first page presented to users when they launch the skill from the desktop entry. More in-depth information on installation and this architecture can be found in the Voice Application Guidelines.
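To give a feel for the desktop-entry side of this, here is a sketch of a standard freedesktop.org .desktop file for a voice application. The file name, Exec command and values are hypothetical placeholders, not copied from the actual Youtube skill; see the Voice Application Guidelines for the real requirements.

```ini
[Desktop Entry]
Type=Application
Name=Youtube Voice Application
Comment=Browse and play videos by voice or remote
# Exec is a placeholder: the actual launch command that hands the skill
# over to the Mycroft GUI shell is platform-specific.
Exec=voice-app-launcher youtube-skill
Icon=youtube-skill
Categories=AudioVideo;Player;
```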
The “Homescreen”: A Simple Starting Point for Key-Based Navigation in Mycroft Skills With a GUI Interface
The homescreen of a voice application can be considered the traditional equivalent of the home tab of a normal application. In the case of the Youtube Voice Application it consists of several categories of available videos, your recently watched history and a search field, and it can be navigated with simple arrow keys to select and browse videos from the various categories. The homescreen can be as simple as a page showcasing Mycroft examples, or as complex as the skill author requires, as is the case above.
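To make the homescreen-registration pattern a bit more concrete, here is a minimal sketch of its shape in Python. The class, handler and page names are illustrative, not taken from the real Youtube skill, and small stubs stand in for mycroft-core's `MycroftSkill` base class and GUI proxy so that the sketch is self-contained; a real skill would instead do `from mycroft import MycroftSkill` and inherit a working GUI connection.

```python
# Illustrative sketch of registering a "homescreen" in a GUI skill.
# GuiStub and MycroftSkill below are stand-ins for readability only;
# a real skill inherits these from mycroft-core.

class GuiStub(dict):
    """Stands in for the skill's GUI proxy object."""
    def show_page(self, page):
        # A real proxy would render the named QML file on screen.
        self.shown = page

class MycroftSkill:
    """Stand-in base class; real skills inherit mycroft-core's."""
    def __init__(self):
        self.gui = GuiStub()

class DemoVideoSkill(MycroftSkill):
    def handle_homescreen(self, message=None):
        # Values set on self.gui become session data that the QML
        # homescreen can read; show_page() then displays the page.
        self.gui["videoCategories"] = ["Music", "News", "Gaming"]
        self.gui.show_page("Homescreen.qml")

skill = DemoVideoSkill()
skill.handle_homescreen()
```

The handler that shows the homescreen is what gets wired to the desktop entry, so launching the application from the launcher and asking for it by voice both land on the same page.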
The API documentation for developing voice and GUI-based skills and converting them into Voice Applications for Plasma Bigscreen is available at the links below:
- Mycroft Skills Development Documentation: https://mycroft-ai.gitbook.io/docs/skill-development/introduction
- Mycroft GUI Documentation: https://mycroft-ai.gitbook.io/docs/skill-development/displaying-information/mycroft-gui
- Voice Application Guidelines: https://plasma-bigscreen.org/development-guidelines
Aura Browser – Designed To Be Controlled By Just Your Remote
I would like to introduce you to Aura Browser, a new browser based on QtWebEngine that I had the opportunity to work on for the Plasma Bigscreen beta image. It is designed entirely around simple arrow-key navigation, so it complements browsing the web with just a remote control and no physical mouse. It features a virtual mouse controlled by the arrow keys with auto scrolling, and has support for tab-based browsing, basic bookmarks, basic downloads and permission management.
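The virtual-mouse idea can be sketched in a few lines. This is not Aura Browser's actual implementation (which lives in its QML/C++ sources); it is a toy Python model, with illustrative names and values, of the general technique: arrow keys move an on-screen pointer, and pushing past the viewport edge scrolls the page instead.

```python
# Toy model of an arrow-key-driven virtual cursor with auto scrolling.
# Illustrative only; Aura Browser's real logic is in its QML/C++ code.

STEP = 10  # pixels moved per key press (illustrative value)

class VirtualCursor:
    def __init__(self, viewport_w, viewport_h):
        # Start the pointer in the middle of the viewport.
        self.x, self.y = viewport_w // 2, viewport_h // 2
        self.w, self.h = viewport_w, viewport_h
        self.scroll_y = 0  # vertical page scroll offset

    def key(self, direction):
        dx, dy = {"left": (-STEP, 0), "right": (STEP, 0),
                  "up": (0, -STEP), "down": (0, STEP)}[direction]
        # Horizontal movement is clamped to the viewport.
        self.x = min(max(self.x + dx, 0), self.w)
        new_y = self.y + dy
        if 0 <= new_y <= self.h:
            self.y = new_y
        else:
            # Pointer hit the top or bottom edge: scroll the page
            # instead of moving the pointer (auto scrolling).
            self.scroll_y = max(self.scroll_y + dy, 0)

cursor = VirtualCursor(640, 480)
cursor.key("down")  # pointer moves 10 px down from the centre
```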
The browser is in its early stages but is available on the Plasma Bigscreen beta image release for the Raspberry Pi 4. The source can be found at: https://invent.kde.org/adityam/aura-browser