Finally got a chance to look in detail at the Allrecipes skill for Amazon's Echo Alexa. It lets you voice-search 60K+ recipes and then have them recited step by step or sent to your device. Works well, but has ease-of-use limitations depending on your context. We used this rough-cut model of process to frame a method: What do you need? What do you do? A recipe of resources and actions. Useful beyond a to-do list.
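That two-question model can be sketched directly as a data structure. A minimal, hypothetical sketch (the names `Process` and `next_step` are mine, not from any real skill API): a process is resources plus an ordered list of actions, recited one step at a time the way a voice skill responds to "next."

```python
from dataclasses import dataclass

@dataclass
class Process:
    """A process as a recipe: resources plus ordered actions."""
    name: str
    needs: list       # "What do you need?" -- resources/ingredients
    steps: list       # "What do you do?"   -- ordered actions
    _cursor: int = 0  # position for step-by-step recitation

    def next_step(self) -> str:
        """Recite the next step, as a voice skill would on 'next'."""
        if self._cursor >= len(self.steps):
            return "Done."
        step = self.steps[self._cursor]
        self._cursor += 1
        return step

pancakes = Process(
    name="Pancakes",
    needs=["flour", "milk", "eggs"],
    steps=["Mix dry ingredients.", "Whisk in milk and eggs.", "Fry the batter."],
)
```

The same structure holds a to-do list, a recipe, or a repair procedure; only the contents of `needs` and `steps` change.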
Such a method could also be used to store and access steps in a business process, perhaps leading to better standardization of that process. It probably works best in hands-free situations: useful for a salesperson in a car, or a repair person stepping through a checklist. The search for the process-step description could add other context-sensitive, bot-like information to the interaction, based on supporting data.
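The context-sensitive part might look like this: each step carries optional supporting data, and the assistant appends whatever hint matches the current context while reciting. A rough sketch only; the field names (`say`, `hints`) are illustrative.

```python
def recite(steps, context):
    """Yield each step's text, appending any hint that matches the context."""
    for step in steps:
        line = step["say"]
        hint = step.get("hints", {}).get(context)
        if hint:
            line += " Note: " + hint
        yield line

# A hypothetical repair checklist with a hint that only fires in the field.
checklist = [
    {"say": "Confirm the customer's model number.",
     "hints": {"field": "The serial plate is inside the rear panel."}},
    {"say": "Run the diagnostic cycle."},
]

for line in recite(checklist, context="field"):
    print(line)
```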
Today we expect ready access to online text resources on laptops or smartphones, wherever we are, so the method of using voice interaction has its own limitations. But it can also be seen as another supportive channel for providing knowledge about process, leading to better results.
Check out Allrecipes. How would you modify it to handle process? Is someone working on this? On other platforms too, like phones and Google Home? Be glad to talk, contribute. - Franz Dill