CNN  — 

Google’s voice assistant software is getting an upgrade.

The company announced a number of changes coming to Google Assistant at its annual Google I/O developer conference on Tuesday.

In the next few months, the digital helper should get better at understanding key people, places, and dates in your life — though you may need to give it some help first.

During a demo ahead of the conference, Manuel Bronstein, vice president of product for Google Assistant, showed CNN Business how you will, for instance, be able to add details to the Assistant app about people, dates, and places, and then ask questions like, “What’s the weather like at mom’s house this weekend?” or “navigate to daycare,” and get an appropriate response.

Some of this data will be pulled from places such as your contacts, calendar, or places you often visit, but users will also be able to add or delete information that the Assistant knows about them from the Assistant settings.

Google, which offers its assistant on over 1 billion devices, is facing tough competition in the fast-growing market for digital assistants, as Amazon’s Alexa and Apple’s Siri compete on a number of devices as well. There are also mounting privacy concerns as these assistants show up not just in people’s pockets, but in their living rooms, cars, and other places. While they are meant to only start paying attention to your commands after hearing a specific wake word or phrase, in practice these smart assistants often collect snippets of conversation that were not directed toward them.

The Assistant will also get a bit more contextually aware. Starting Tuesday, if you ask it to set a timer for five minutes, once that timer goes off, you can simply say, “Stop” rather than saying a wake-up phrase first. To make this work, Bronstein said, Google (GOOG) needed to make “stop” a command that would push the Assistant to act — but only have it work for a short time after the right contextual trigger (in this case a timer) has been pulled. This particular command will run directly on the device where the Assistant resides — such as your smartphone or smart speaker — rather than requiring internet access to operate.

Offline accessibility is a growing focus for Google; the company said that later this year much of the artificial intelligence behind the Assistant will be able to run directly on smartphones. This move should allow the Assistant to process commands and respond to them more quickly. Google also said the change will enable real-time transcription of speech on your phone, with or without internet access. However, this offline functionality will first be available only on the company’s new Pixel smartphones.

Remember Duplex, Google’s eerily human-sounding AI technology that can make restaurant reservations? Since unveiling it last May, the company has been rolling it out to the Assistant on Android smartphones and iPhones. And later this year, Google plans to use this same technology (in a voiceless, less creepy-seeming manner) to let you ask the Assistant to buy movie tickets or reserve a rental car on the mobile web — a process that tends to be cumbersome and involve filling out lots of forms.