The AI tagging techniques Apple plans to use to improve App Store discoverability are now available in the developer beta build of iOS 26.
However, the tags are not yet appearing on the public App Store, nor are they informing its search algorithm.

Of course, as with any upcoming App Store update, there's speculation about how the changes will impact an app's search ranking.
A new analysis by app intelligence provider Appfigures, for example, suggests that metadata extracted from an app's screenshots is influencing its ranking.
The firm theorized that Apple was extracting text from screenshot captions using optical character recognition (OCR). Previously, it noted, only an app's name, subtitle, and keyword list counted toward its search ranking.
The conclusion that screenshots are informing app discoverability is accurate, based on what Apple announced at its Worldwide Developers Conference (WWDC 25), but the way Apple extracts that data involves AI, not the OCR techniques Appfigures had guessed at.
At its annual developer conference, Apple explained that screenshots and other metadata would be used to help improve an app's discoverability. The company said it's using AI techniques to extract information that would otherwise be buried in an app's description, its category information, its screenshots, or other metadata. That also means developers shouldn't need to add keywords to their screenshots or take other steps to influence the tags.
This allows Apple to assign a tag to better categorize the app. Ultimately, developers would be able to control which of these AI-assigned tags would be associated with their apps, the company said.
Plus, Apple assured developers that humans would review the tags before they went live.
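Apple hasn't detailed the implementation, but the flow it describes can be sketched in rough strokes: pool an app's metadata, let a model propose tags, then gate them behind human review and developer control. The Python below is purely illustrative; the field names, the naive extract_tags stand-in (a simple substring match, not Apple's actual AI), and the allowlist mechanics are all assumptions, not anything Apple has published.

```python
from dataclasses import dataclass, field

@dataclass
class AppListing:
    """Metadata sources Apple says inform tagging; field names are illustrative."""
    name: str
    subtitle: str
    description: str
    category: str
    screenshot_captions: list[str] = field(default_factory=list)

def extract_tags(text: str, vocabulary: set[str]) -> list[str]:
    # Crude stand-in for Apple's undisclosed AI extraction step:
    # just surface any vocabulary tag mentioned in the pooled text.
    lowered = text.lower()
    return sorted(tag for tag in vocabulary if tag.lower() in lowered)

def assign_tags(app: AppListing,
                vocabulary: set[str],
                reviewer_approved: set[str],
                developer_opt_out: set[str]) -> list[str]:
    # Pool every metadata source, so signals buried in the description
    # or screenshots surface without developers stuffing in keywords.
    corpus = " ".join([app.name, app.subtitle, app.description,
                       app.category, *app.screenshot_captions])
    candidates = extract_tags(corpus, vocabulary)
    # Apple says humans review tags before they go live, and developers
    # can control which tags are associated with their apps; both are
    # modeled here as simple allow and opt-out sets.
    return [t for t in candidates
            if t in reviewer_approved and t not in developer_opt_out]
```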
Once the tags reach global App Store users, it will be important for developers to understand them and learn which ones help their apps get discovered.