Some Mycroft A.I. Ideas I developed over time
It has been a while now since I first came into contact with Mycroft A.I. Since then I have been playing around with it and started the "create your own Personal A.I. Assistant" blog series. I will continue that series shortly, updating the content with the latest and greatest developments from the Mycroft guys.
Anyhow, while working on the project and its different aspects, new ideas pop up in your head. This blog post is to air those ideas: either to inspire others, or for others to pick one up and run with it. Please reach out to me if you want to discuss any of these ideas in more detail (if possible / available, of course).
ReSpeaker Enclosure Skill
The ReSpeaker circular product line has a nice ring of LEDs. For our DIY Personal A.I. Assistant we use the 4-Mic Array, which contains 12 APA102 programmable LEDs. Recently the new 6-Mic Array has been released as well, with similar LEDs and control.
As with the Alexa, the Google Home and even the Mark-1 devices, these LEDs can be used to visualize the different statuses of Mycroft: listening, thinking, speaking, etc.
The Mark-1 has an enclosure skill that controls the front mouth and eye LEDs. The idea is to take this as a starting point and convert that skill to use the LED drivers and code from Seeed: https://github.com/respeaker/mic_array
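Under the hood, the Seeed driver shifts APA102 frames out over SPI. As a hardware-free illustration of what such a driver builds (assuming the common 32-bit start/end frame convention from the APA102 datasheet; an actual driver would write these bytes via spidev), a frame for the 12-LED ring could be composed like this:

```python
def apa102_frame(colors, brightness=4):
    """Build the SPI byte stream for a chain of APA102 LEDs.

    colors: list of (r, g, b) tuples, one per LED (12 for the 4-Mic ring).
    brightness: 0-31 global brightness, packed into each LED frame header.
    """
    frame = bytearray(4)  # start frame: 32 zero bits
    for r, g, b in colors:
        # per-LED frame: 0b111 + 5-bit brightness, then blue, green, red
        frame += bytes((0xE0 | (brightness & 0x1F), b, g, r))
    frame += bytes([0xFF] * 4)  # end frame: 32 one bits
    return bytes(frame)
```

An enclosure skill could then map Mycroft states to color patterns, e.g. an all-blue ring while listening, and hand the resulting frame to the SPI bus.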
ReSpeaker Audio Processor
Make use of all the capabilities of the ReSpeaker Mic Array: DOA (Direction of Arrival), beamforming, AEC (Acoustic Echo Cancellation), etc., as described on the Seeed / ReSpeaker wiki.
We could write a PulseAudio plugin to make full use of all the mics and hardware of the ReSpeaker board. https://respeaker.io/make_a_smart_speaker/ is a good starting point to see which audio processes can be used. Acoustic Echo Cancellation is already available in a small showcase project here: https://github.com/voice-engine/ec
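To give a feel for what DOA involves: it typically reduces to estimating the time difference of arrival between microphone pairs, and GCC-PHAT is a common way to do that. This is a minimal, hardware-free sketch of the underlying idea (not the Seeed implementation):

```python
import numpy as np

def gcc_phat(sig, ref, fs=16000, max_tau=None):
    """Estimate the delay of `sig` relative to `ref` using GCC-PHAT.

    Returns the delay in seconds (positive if `sig` lags `ref`).
    """
    n = sig.shape[0] + ref.shape[0]
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    # phase transform: discard magnitude, keeping only phase information
    cc = np.fft.irfft(R / (np.abs(R) + 1e-15), n=n)
    max_shift = n // 2
    if max_tau is not None:
        max_shift = min(int(fs * max_tau), max_shift)
    # reorder so the zero-delay bin sits in the middle
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / float(fs)
```

With the mic spacing known, the estimated delay between two mics converts directly into an angle of arrival; doing this for all pairs on the 4-Mic ring gives the direction the voice came from.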
Google Assistant / Alexa Fallback Skill
A small idea is to integrate Google Assistant and Alexa within our Personal A.I. Assistant project. As we already have everything set up for Mycroft, it is only a small step to install and configure both Alexa and Google Assistant on the same device.
From there it is a small step to create a fallback skill for Mycroft that forwards the user's question whenever Mycroft cannot answer it with the installed skills. If Mycroft doesn't know the answer, the fallback skill could be triggered by:
“Hey Mycroft, ask google”
“Hey Mycroft, does Alexa know”
The question is then forwarded to either Google Assistant or Alexa, which responds to the user.
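A real implementation would subclass Mycroft's FallbackSkill and register a handler with register_fallback, but the routing decision itself boils down to something like this sketch (the forwarder names and callables below are placeholders, not real APIs):

```python
def route_utterance(utterance, local_answer, forwarders):
    """Pick an answer source: Mycroft itself, or a fallback assistant.

    forwarders: ordered dict of name -> callable(utterance), where each
    callable returns an answer string or None if it has no answer.
    """
    if local_answer is not None:
        return "mycroft", local_answer
    for name, ask in forwarders.items():
        answer = ask(utterance)  # e.g. send the text to that assistant
        if answer is not None:
            return name, answer
    return None, "Sorry, I have no idea."
```

Phrasings like "ask Google" or "does Alexa know" could additionally be parsed to pick one specific forwarder instead of trying them in order.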
TTS Listening sound
As soon as the wake word is heard, a small WAV file is played to confirm to the user that listening mode is enabled. It would be cool to make this configurable and use mimic/mimic2, so that instead of the small beep you could, for instance, let Mycroft say something like "Yes", "Yes, master" or "Yes, Peter".
This gives it a more personal touch. It can already be done by running mimic from the command line and writing the output to a WAV file: ./mimic -t "Yes, Peter" -o listening.wav
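Mycroft's configuration already has a "sounds" section pointing at the default beep, so making this configurable could be as simple as pointing that entry at the generated file, e.g. in mycroft.conf (the listening.wav path is whatever you produced with mimic):

```json
{
  "sounds": {
    "start_listening": "snd/listening.wav"
  }
}
```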
VOIP (and/or XMPP) Support
Not much to say here as the idea is rather self-explanatory: some sort of VOIP skill so we can make and receive calls using Mycroft.
WPE Webkit browser as default HDMI interface
Again, not much to say here.
Dutch language support
Adding support for the Dutch language. I haven't really started on this, as things are changing rather quickly and the Mozilla guys are already accepting data. I think this will be one of those little projects you work on whenever you get stuck code-wise and just need some easy tasks to do.
MycroftOS – Bare minimal Buildroot OS
This idea is a whole new and separate project on this website and is currently being worked on. The first blog post was published a little while ago. More information can be found on my GitHub account: https://github.com/j1nx/MycroftOS/
In short: MycroftOS is a bare minimal OS based on Buildroot to run the Mycroft A.I. software stack on embedded devices such as the Raspberry Pi. Perhaps in the future an alternative for Picroft and/or the Mark-1 device.
Gideon wake word training
As a HUGE fan of the DC television series The Flash, it would be so cool to have "Gideon" as wake word. This can already be done using PocketSphinx of course, but the idea here is to either train it myself using Precise, or to vote for it to be trained as one of the next wake words by the Mycroft people. Combine this with the new avatar stuff from TREE-Industries to make it even more appealing.
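For the PocketSphinx route, mycroft.conf lets you define a custom hotword and point the listener at it. A sketch of what that could look like (the phoneme string and threshold below are my guesses and would need tuning):

```json
{
  "listener": {
    "wake_word": "gideon"
  },
  "hotwords": {
    "gideon": {
      "module": "pocketsphinx",
      "phonemes": "G IH D IY AH N .",
      "threshold": 1e-30
    }
  }
}
```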
Events (like voicemail) skill
You are not always at home; your Personal A.I. Assistant, however, is (at least most of the time, and for now). If you have connected your Home Assistant or other automation system, a lot of things may have happened during your absence. Some are more important than others, but it would be cool to have the important messages queued and played as soon as you get home: "Hey Mycroft, what did I miss?"
This could be extended with presence detection to automate it, so that Mycroft informs you about the REALLY important messages even when you don't ask.
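The core of such a skill is just a small mailbox that queues events while you are away and drains them on request. A minimal sketch (the event sources and how "importance" is decided are left open):

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    message: str
    important: bool = False

class EventMailbox:
    """Queue home-automation events while nobody is home; replay on request."""

    def __init__(self):
        self._events = deque()

    def record(self, message, important=False):
        self._events.append(Event(message, important))

    def what_did_i_miss(self):
        # drain the queue, announcing important messages first
        events = sorted(self._events, key=lambda e: not e.important)
        self._events.clear()
        return [e.message for e in events]

    def urgent(self):
        # messages Mycroft should announce unprompted via presence detection
        return [e.message for e in self._events if e.important]
```

A "what did I miss" intent would speak the drained list, while a presence-detection hook could push anything from urgent() as soon as you walk in the door.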