To conclude this first series of posts, and before we delve into other projects, we will write a simple program that verifies we performed every step correctly and that we can freely use the bridges established between Cozmo, the Voice Kit and our smartphone. To access the examples our first test program will build on, open the "Dev terminal" found on the Desktop and navigate to the voice examples (use "cd aiyprojects-raspbian/src/examples/voice" to do so). Our "Hello world" will be based on the script "assistant_library_with_local_commands_demo.py", which lets the user create custom voice commands to be executed or interpreted by the AIY Voice Kit.
To create such a new command, we first define a string containing the words the user should say. For instance, if we want Cozmo to say something when asked to, we can use the string "robot speak" (keep it to two words at most). We should insert the following lines after the 77th line of the aforementioned Python script:
elif text == 'robot speak':
    # insert desired function here
After importing Cozmo's libraries by writing "import cozmo" at the very beginning of the script, we will also write the function in charge of making our robot speak. My personal "Hello world" was a sentence I hold dear: "The little ones bite, but they're still not a threat". To create said function, we must write the following after all the imports:
def cozmo_helloworld(robot: cozmo.robot.Robot):
    robot.say_text("The little ones bite, but they're still not a threat").wait_for_completed()
Finally, we will call this function from the aforementioned "elif":
elif text == 'robot speak':
    cozmo.run_program(cozmo_helloworld)
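To see how the pieces fit together, here is a minimal sketch of the dispatch logic we just edited. It runs without a robot or the cozmo library: `cozmo_helloworld_stub` is a hypothetical stand-in for the `cozmo.run_program(cozmo_helloworld)` call, and in the real script `text` would come from the Assistant's speech recognizer rather than being passed in directly.

```python
def cozmo_helloworld_stub():
    """Stand-in for cozmo.run_program(cozmo_helloworld); returns the
    sentence instead of making the robot speak it."""
    return "The little ones bite, but they're still not a threat"

def handle_command(text):
    """Mirror of the elif chain we added to the demo script."""
    if text == 'robot speak':
        return cozmo_helloworld_stub()  # our new custom command
    return None  # unrecognized phrases fall through to the Assistant

print(handle_command('robot speak'))
```

In the actual demo script the surrounding function already handles the Assistant's built-in events, so our addition only needs the one new `elif` branch.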
This way, when we execute the "assistant_library_with_local_commands_demo.py" script, we only have to wait until the program boots up and then say "OK Google" to trigger its "listening" mechanism (the button will light up while the Voice Kit is listening). Then we can say "robot speak" and watch Cozmo say the sentence we previously wrote ("The little ones bite, but they're still not a threat"). This very same process allows the user not only to make Cozmo say anything they want, but also to run more complex commands involving movement, artificial vision and path learning, all using Cozmo's SDK libraries. A small example is this clip in which my mother asks Cozmo to recite some verses from José de Espronceda's "Canción del Pirata":
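If you plan to add several commands, the growing elif chain can be swapped for a dictionary dispatch. The sketch below is purely illustrative: the phrases and handler names are hypothetical, and the stubs stand in for the Cozmo SDK routines (say_text, drive_straight, and so on) that a real handler would wrap via cozmo.run_program.

```python
def speak():
    return "speaking"          # would wrap robot.say_text(...)

def drive():
    return "driving forward"   # would wrap robot.drive_straight(...)

# Hypothetical command table: each recognized phrase maps to a handler.
COMMANDS = {
    'robot speak': speak,
    'robot drive': drive,
}

def handle_command(text):
    handler = COMMANDS.get(text)
    return handler() if handler else None  # None: let the Assistant answer

print(handle_command('robot drive'))
```

Adding a new voice command then becomes a one-line entry in the table instead of another branch in the script.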
The only thing left now is to let our imagination flow and get our little buddy to do everything we want. Perhaps this could be the start of a new project of your own. Feel free to comment with your ideas and experiences, and I will make sure to answer any doubts you might have regarding Cozmo or the Voice Kit.
It has been a pleasure getting involved in this little project, and I hope anyone who reads this gets to feel the same sense of fulfillment I got to experience when watching Cozmo pronounce his first words.