
With developer program, Google Photos is about to become a lot more versatile

What if Google Photos could digitize your old photos, turn receipt photos into an expense list, or send new photos to that digital photo album? Well, Google Photos will soon be able to tackle all of that and more, but Google is not exactly the one building those features. On Wednesday, May 9, during I/O, the tech giant announced the Google Photos Partner Program. The program allows developers to integrate Google Photos into their own apps for a more seamless experience, expanding the service beyond its native app.

The developer program gives app makers an API that lets them tap into specific features of Google Photos. With it, apps can read from a user's Google Photos library, add to that library, or share from it.
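To give a rough sense of what that access looks like in practice, here is a minimal sketch of reading a user's library through the Google Photos Library API's REST interface. The access token is a placeholder; obtaining it through OAuth with the user's consent is omitted.

```python
# Minimal sketch: list a user's recent media items via the Google Photos
# Library API (REST). Assumes an OAuth 2.0 access token already granted
# with the user's explicit consent.
import requests

ACCESS_TOKEN = "ya29.example-token"  # placeholder, not a real token

def list_recent_media_items(page_size=25):
    """Return one page of the user's most recent media items."""
    response = requests.get(
        "https://photoslibrary.googleapis.com/v1/mediaItems",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"pageSize": page_size},
    )
    response.raise_for_status()
    return response.json().get("mediaItems", [])

if __name__ == "__main__":
    for item in list_recent_media_items():
        # Each item carries an id, a filename, and a baseUrl that can be
        # used to fetch the image bytes at a chosen size.
        print(item["id"], item.get("filename"))
```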


While the API will be open to a range of developers, Google's I/O presentation already named several apps that are integrating it. Timehop can use the access to surface old memories from Google Photos. HP Sprocket, a mobile printer, can pull those photos into the printer's app to create a physical copy. Expense app Xero can look through the Google Photos library to collect photos of receipts for an expense list. Legacy Republic, a company that digitizes old photo albums, can use the API to deposit scanned photos into the user's Google Photos account. And digital frame company Nixplay can automatically send images added to a shared album to the frame.

The API lets developers build Google Photos integration into their apps in three ways. A connection gives the app access to Google Photos with the user's permission. An upload option lets other apps send images to Google Photos or create an album. And a sharing function lets apps create shared albums, such as a photo booth company spinning up a shared album so you walk away with more than the physical print.
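As a rough illustration of the upload and sharing paths, the sketch below uses the Library API's REST endpoints to upload a photo, add it to the user's library, and create a shared album. The token and file path are placeholders, and error handling is kept to a minimum.

```python
# Hedged sketch of the "upload" and "sharing" paths using the Google Photos
# Library API (REST). Token and file path are placeholders.
import requests

API = "https://photoslibrary.googleapis.com/v1"
HEADERS = {"Authorization": "Bearer ya29.example-token"}  # placeholder

def upload_photo(path: str) -> str:
    """Upload raw image bytes and return an upload token."""
    with open(path, "rb") as f:
        resp = requests.post(
            f"{API}/uploads",
            headers={**HEADERS,
                     "Content-Type": "application/octet-stream",
                     "X-Goog-Upload-Protocol": "raw"},
            data=f.read(),
        )
    resp.raise_for_status()
    return resp.text  # the response body is the upload token

def add_to_library(upload_token: str, description: str = "") -> dict:
    """Turn an upload token into a media item in the user's library."""
    resp = requests.post(
        f"{API}/mediaItems:batchCreate",
        headers=HEADERS,
        json={"newMediaItems": [{
            "description": description,
            "simpleMediaItem": {"uploadToken": upload_token},
        }]},
    )
    resp.raise_for_status()
    return resp.json()

def create_shared_album(title: str) -> dict:
    """Create an album, then mark it as shared so others can join."""
    album = requests.post(f"{API}/albums", headers=HEADERS,
                          json={"album": {"title": title}})
    album.raise_for_status()
    album_id = album.json()["id"]
    share = requests.post(f"{API}/albums/{album_id}:share",
                          headers=HEADERS, json={})
    share.raise_for_status()
    return share.json()  # includes shareInfo with a shareable link
```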

While some of the potential integrations are simple conveniences, app developers will also be able to tap into some of the technology behind Google Photos. Apps can use Google-built filters to help users find the right image, for example.
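The Library API's search call, for instance, accepts content filters backed by Google's own image classification, which is how an expense app could pull only receipt photos. A minimal sketch, assuming the documented RECEIPTS content category:

```python
# Hedged sketch: search a user's library for photos Google has classified
# as receipts, using the Library API's content filters.
import requests

def find_receipt_photos(access_token: str) -> list:
    resp = requests.post(
        "https://photoslibrary.googleapis.com/v1/mediaItems:search",
        headers={"Authorization": f"Bearer {access_token}"},
        json={
            "pageSize": 50,
            "filters": {
                "contentFilter": {"includedContentCategories": ["RECEIPTS"]}
            },
        },
    )
    resp.raise_for_status()
    return resp.json().get("mediaItems", [])
```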

Google is providing the access; what comes out of the developer program is up to third-party app developers and the structure and limitations of the API. Google says it uses a stringent app review process before granting access, one that includes ongoing checks rather than a one-and-done approval. The program follows strict privacy guidelines, Google says, and users have to explicitly grant an app permission before it can access, upload, or share anything.
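In practice, that explicit consent happens through OAuth scopes, roughly one per capability. The sketch below assumes the google-auth-oauthlib client library and a placeholder client_secret.json; the scope names are the Photos Library API's read, append, and sharing scopes.

```python
# Sketch of the consent step: the app requests only the scopes it needs,
# and the user sees a consent screen before any access is granted.
from google_auth_oauthlib.flow import InstalledAppFlow

# Each scope maps to one of the capabilities described above.
SCOPES = [
    "https://www.googleapis.com/auth/photoslibrary.readonly",    # access
    "https://www.googleapis.com/auth/photoslibrary.appendonly",  # uploads
    "https://www.googleapis.com/auth/photoslibrary.sharing",     # shares
]

flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
credentials = flow.run_local_server(port=0)  # opens the consent screen
print("Access token:", credentials.token)
```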

The new developer partner program comes after Google announced new artificial intelligence tools inside Google Photos, including suggested actions and edits, as well as the ability to colorize black-and-white photos.
