Tag: Android

  • Android Developers Blog: Introducing Widget Quality Tiers



    Posted by Ivy Knight – Senior Design Advocate

    Level up your app Widgets with new quality tiers

    Widgets can be a powerful tool for engaging users and increasing the visibility of your app. They can also help you to improve the user experience by providing users with a more convenient way to access your app’s content and features.

    To build a great Android widget, it should be helpful, adaptive, and visually cohesive with the overall aesthetic of the device home screen.

    In order to help you achieve a great widget, we are pleased to introduce Android Widget Quality Tiers!

    The new widget quality tiers are here to guide you toward a best-practice implementation of widgets that will look great and bring your users value across the ecosystem of Android phones, tablets, and foldables.

    What does this mean for widget makers?

    Whether you are planning a new widget, or investing in an update to an existing widget, the Widget Quality Tiers will help you evaluate and plan for a high quality widget.

    Just like the Large Screen quality tiers help you optimize app experiences, these new widget tiers are designed to ensure your widgets are not just functional, but also visually appealing and user-friendly across all Android devices.

    Two screenshots of a phone display different views in the Google Play app. The first shows a list of running apps with the Widget filter applied in a search for 'Running apps'; the second shows the Nike Run Club app page.

    Widgets that meet quality tier guidelines will be discoverable under the new Widget filter in Google Play.

    Consider using our Canonical Widget layouts, which are based on Jetpack Glance components, to make it easier for you to design and build a Tier 1 widget your users will love.
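
    To make that concrete, here is a minimal Jetpack Glance sketch of a simple widget; the PlantsWidget name and its to-do content are hypothetical and not one of the canonical layouts. Wrapping the content in GlanceTheme picks up Material 3 dynamic color, and because no custom corner radius or background shape is set, the system defaults apply.

    import android.content.Context
    import androidx.compose.ui.unit.dp
    import androidx.glance.GlanceId
    import androidx.glance.GlanceModifier
    import androidx.glance.GlanceTheme
    import androidx.glance.appwidget.GlanceAppWidget
    import androidx.glance.appwidget.GlanceAppWidgetReceiver
    import androidx.glance.appwidget.provideContent
    import androidx.glance.background
    import androidx.glance.layout.Column
    import androidx.glance.layout.fillMaxSize
    import androidx.glance.layout.padding
    import androidx.glance.text.Text

    // Hypothetical widget: GlanceTheme supplies dynamic color, and the
    // system-provided corner radius and background shape are left untouched.
    class PlantsWidget : GlanceAppWidget() {
        override suspend fun provideGlance(context: Context, id: GlanceId) {
            provideContent {
                GlanceTheme {
                    Column(
                        modifier = GlanceModifier
                            .fillMaxSize()
                            .background(GlanceTheme.colors.surface)
                            .padding(12.dp)
                    ) {
                        Text("Water plants")
                        Text("Water more plants")
                    }
                }
            }
        }
    }

    // The receiver declared in AndroidManifest.xml points at the widget above.
    class PlantsWidgetReceiver : GlanceAppWidgetReceiver() {
        override val glanceAppWidget: GlanceAppWidget = PlantsWidget()
    }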

    Let’s take a look at the Widget Quality Tiers

    There are three tiers built with required system defaults and suggested guidance to create an enhanced widget experience:

    Tier 1: Differentiated

    Four mockups show examples of Material Design 3 dynamic color applied to an app called 'Radio Hour'.

    Differentiated widgets go further by implementing theming and adapting to resizing.

    Tier 1 widgets are exemplary widgets offering hero experiences that are personalized and create unique, productive home screens. These widgets meet Tier 2 standards plus enhancements to the layout, color, discovery, and system coherence criteria.

    A stylized cartoon figure holds their chin thoughtfully while a chat bubble icon is highlighted

    For example, use the system-provided corner radius and don’t set a custom corner radius on widgets.

    Add more personalization with dynamic color and generated previews while ensuring your widgets look good across devices by not overriding system defaults.

     Four mockups show examples of Material Design 3 components on Android: a contact card, a podcast player, a task list, and a news feed.

    Tier 1 widgets that, from the top left, properly crop content, fill the layout bounds, have appropriately sized headers and touch targets, and make good use of colors and contrast.

    Tier 2: Quality Standard

    These widgets are helpful, usable, and provide a quality experience. They meet all criteria for layout, color, discovery, and content.

    A simple to-do list app widget displays two tasks: 'Water plants' and 'Water more plants.' Both tasks have calendar icons next to them. The app is titled 'Plants' and has search and add buttons in the top right corner.

    Make sure your widget has appropriate touch targets.

    Tier 2 widgets are functional but simple; they meet the basic criteria for a usable widget. But if you want to create a truly stellar experience for your users, the Tier 1 criteria introduce ways to make a more personal, interactive, and coherent widget.

    Tier 3: Low Quality

    These widgets don’t meet the minimum quality bar and don’t provide a great user experience; they fail to follow, or are missing, criteria from Tier 2.

     Examples of Material Design 3 widgets are displayed on a light pink background with stylized X shapes. Widgets include a podcast player, a contact card, to-do lists, and a music player.

    Clockwise from the top left: not filling the bounds, poorly cropped content, low color contrast, a mis-sized header, and small touch targets.

    A stylized cartoon person with orange hair, a blue shirt, holds a pencil to their cheek.  'Kacie' is written above them, with a cut off chat bubble icon.

    For example, ensure content is visible and not cropped.

    Build and elevate your Android widgets with Widget Quality Tiers

    Dive deeper into the widget quality tiers and start building widgets that not only look great but also provide an amazing user experience! Check out the official Android documentation for detailed information and best practices.


    This blog post is part of our series: Spotlight Week on Widgets, where we provide resources—blog posts, videos, sample code, and more—all designed to help you design and create widgets. You can read more in the overview of Spotlight Week: Widgets, which will be updated throughout the week.



    Source link

  • Generate stunning visuals in your Android apps with Imagen 3 via Vertex AI in Firebase



    Posted by Thomas Ezan Sr. – Android Developer Relation Engineer (@lethargicpanda)

    Imagen 3, our most advanced image generation model, is now available through Vertex AI in Firebase, making it even easier to integrate into your Android apps.

    Designed to generate well-composed images with exceptional details, reduced artifacts, and rich lighting, Imagen 3 represents a significant leap forward in image generation capabilities.

    Hot air balloons float over a scenic desert landscape with unique rock formations.

    Image generated by Imagen 3 with prompt: “Shot in the style of DSLR camera with the polarizing filter. A photo of two hot air balloons over the unique rock formations in Cappadocia, Turkey. The colors and patterns on these balloons contrast beautifully against the earthy tones of the landscape below. This shot captures the sense of adventure that comes with enjoying such an experience.”

    A wooden robot stands in a field of yellow flowers, holding a small blue bird on its outstretched hand.

    Image generated by Imagen 3 with prompt: A weathered, wooden mech robot covered in flowering vines stands peacefully in a field of tall wildflowers, with a small blue bird resting on its outstretched hand. Digital cartoon, with warm colors and soft lines. A large cliff with a waterfall looms behind.

    Imagen 3 unlocks exciting new possibilities for Android developers. Generated visuals can adapt to the content of your app, creating a more engaging user experience. For instance, your users can generate custom artwork to enhance their in-app profile. Imagen can also improve your app’s storytelling by bringing its narratives to life with delightful personalized illustrations.

    You can experiment with image prompts in Vertex AI Studio, and learn how to improve your prompts by reviewing the prompt and image attribute guide.

    Get started with Imagen 3

    The integration of Imagen 3 is similar to adding Gemini access via Vertex AI in Firebase. Start by adding the Gradle dependencies to your Android project:

    dependencies {
        implementation(platform("com.google.firebase:firebase-bom:33.10.0"))
    
        implementation("com.google.firebase:firebase-vertexai")
    }
    

    Then, in your Kotlin code, create an ImagenModel instance by passing the model name and, optionally, a model configuration and safety settings:

    val imageModel = Firebase.vertexAI.imagenModel(
      // Imagen 3 model name
      modelName = "imagen-3.0-generate-001",
      // Optional output configuration
      generationConfig = ImagenGenerationConfig(
        imageFormat = ImagenImageFormat.jpeg(compressionQuality = 75),
        addWatermark = true,
        numberOfImages = 1,
        aspectRatio = ImagenAspectRatio.SQUARE_1x1
      ),
      // Optional safety settings
      safetySettings = ImagenSafetySettings(
        safetyFilterLevel = ImagenSafetyFilterLevel.BLOCK_LOW_AND_ABOVE,
        personFilterLevel = ImagenPersonFilterLevel.ALLOW_ADULT
      )
    )
    

    Finally, generate the image by calling generateImages:

    val imageResponse = imageModel.generateImages(
      prompt = "An astronaut riding a horse"
    )
    

    Retrieve the generated image from the imageResponse and display it as a bitmap as follows:

    val image = imageResponse.images.first()
    val uiImage = image.asBitmap()
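
    Putting the pieces together, here is a minimal sketch of the full flow, assuming a Compose UI; generateImages is a suspend function in the Kotlin SDK, so it is called from a coroutine, and the generateAstronaut helper and GeneratedImage composable are hypothetical names.

    import android.graphics.Bitmap
    import androidx.compose.foundation.Image
    import androidx.compose.runtime.Composable
    import androidx.compose.ui.graphics.asImageBitmap
    import kotlinx.coroutines.Dispatchers
    import kotlinx.coroutines.withContext

    // Calls Imagen 3 off the main thread and returns the first generated image.
    suspend fun generateAstronaut(): Bitmap = withContext(Dispatchers.IO) {
        // imageModel is the ImagenModel created in the snippet above.
        val imageResponse = imageModel.generateImages(
            prompt = "An astronaut riding a horse"
        )
        imageResponse.images.first().asBitmap()
    }

    // Renders the returned Bitmap in Compose.
    @Composable
    fun GeneratedImage(bitmap: Bitmap) {
        Image(
            bitmap = bitmap.asImageBitmap(),
            contentDescription = "Image generated by Imagen 3"
        )
    }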
    

    Next steps

    Explore the comprehensive Firebase documentation for detailed API information.

    Access to Imagen 3 using Vertex AI in Firebase is currently in Public Preview, giving you an early opportunity to experiment and innovate. For pricing details, please refer to the Vertex AI in Firebase pricing page.

    Start experimenting with Imagen 3 today! We’re looking forward to seeing how you’ll leverage Imagen 3’s capabilities to create truly unique, immersive and personalized Android experiences.



    Source link

  • Announcing Android support of digital credentials



    Posted by Rohey Livne – Group Product Manager

    In today’s interconnected world, managing digital identity is essential. Android aims to support open standards that ensure seamless interoperability with various identity providers and services. As part of this goal, we are excited to announce that Android, via Credential Manager’s DigitalCredential API, now natively supports OpenID4VP and OpenID4VCI for digital credential presentation and issuance respectively.

    What are digital credentials?

    Digital credentials are cryptographically verifiable documents. The most common emerging use case for digital credentials is identity documents such as driver’s licenses, passports, or national ID cards. In the coming years, it is anticipated that Android developers will develop innovative applications of this technology for a wider range of personal credentials that users will need to present digitally, including education certifications, insurance policies, memberships, permits, and more.

    Digital credentials can be provided by any installed Android app. These apps are known as “credential holders” and are typically digital wallet apps, such as Google Wallet or Samsung Wallet.

    Other apps not necessarily thought of as “wallets” may also have a use for exposing a digital credential. For example, an airline app might want to offer its users’ air miles reward program membership as a digital credential that can be presented to other apps or websites.

    Digital credentials can be presented by the user to any other app or website on the same device, and Android also supports securely presenting Digital Credentials between devices using the same industry standard protocols used by passkeys (CTAP), by establishing encrypted communication tunnels.

    Users can store multiple credentials across multiple apps on their device. By leveraging OpenID4VP requests from websites using the W3C Digital Credential API, or from native apps using the Android Credential Manager API, a user can select which credential to present from all available credentials across all installed digital wallet apps.

    How digital credentials work

    Presentation

    To present the credential, the verifier sends an OpenID4VP request to the Digital Credential API, which then prompts the user to select a credential from all the credentials that can satisfy the request. Note that the user is selecting a credential, not a digital wallet app:

    Digital credentials selection interface on a mobile device

    Digital credentials selection interface

    Once the user chooses a credential, the Android platform redirects the original OpenID4VP request to the digital wallet app that holds that credential so it can complete the presentation back to the verifier. When the digital wallet app receives the OpenID4VP request from Android, it can also perform any additional due-diligence steps it needs before releasing the credential to the verifier.
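
    For a native Android verifier, the request side of this flow can be sketched as follows; this is a minimal sketch assuming Jetpack Credential Manager’s digital credential support, and openId4VpRequestJson stands in for a real OpenID4VP request that your backend would construct.

    import android.content.Context
    import androidx.credentials.CredentialManager
    import androidx.credentials.DigitalCredential
    import androidx.credentials.GetCredentialRequest
    import androidx.credentials.GetDigitalCredentialOption

    // Sends an OpenID4VP request through the Digital Credential API; Android
    // shows the cross-wallet credential selector and returns the response JSON.
    suspend fun presentCredential(context: Context, openId4VpRequestJson: String): String? {
        val credentialManager = CredentialManager.create(context)

        val request = GetCredentialRequest(
            credentialOptions = listOf(GetDigitalCredentialOption(requestJson = openId4VpRequestJson))
        )

        val result = credentialManager.getCredential(context, request)

        // The holder app's OpenID4VP response comes back as JSON for the verifier to validate.
        return (result.credential as? DigitalCredential)?.credentialJson
    }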

    Issuance

    Android also allows developers to issue their own Digital Credentials to a user’s digital wallet app. This process can be done using an OpenID4VCI request, which prompts the user to choose the digital wallet app that they want to store the credential in. Alternatively, the issuance could be done directly from within the digital wallet app (some apps might not even have an explicit user facing issuance step if they store credentials based on their association to a signed-in user account).

    a single credential in a user's digital wallet app

    A wallet app holds a single credential

    Over time, the user can repeat this process to issue multiple credentials across multiple digital wallet apps:

    multiple credentials in multiple digital wallets held by a single user

    Multiple wallet apps hold multiple credentials

    Note: To ensure that at presentation time Android can appropriately list all the credentials that digital wallet apps hold, digital wallets must register their credentials’ metadata with Credential Manager. Credential Manager uses this metadata to match credentials across available digital wallet apps to the verifier’s request, so that it can only present a list of valid credentials that can satisfy the request for the user to select from.

    Early adopters

    As Google Wallet announced yesterday, soon users will be able to use digital credentials to recover Amazon accounts, access online health services with CVS and MyChart by Epic, and verify profiles or identity on platforms like Uber and Bumble.

    These use cases will take advantage of users’ digital credentials stored in any digital wallet app users have on their Android device. To that end, we’re also happy to share that both Samsung Wallet and 1Password will hold users’ digital credentials as digital wallets and support OpenID standards via Android’s Credential Manager API.

    Learn more

    Credential Manager API lets every Android app implement credential verification or provide credentials on the Android platform.

    Check out our new digital credential documentation to learn how to become a credential verifier that takes advantage of users’ existing digital credentials using Jetpack Credential Manager, or how to become a digital wallet app that holds credentials for other apps or websites to verify.



    Source link

  • Multimodal image attachment is now available for Gemini in Android Studio



    Posted by Paris Hsu – Product Manager, Android Studio

    At every stage of the development lifecycle, Gemini in Android Studio has become your AI-powered companion, making it easier to build high quality apps. We are excited to announce a significant expansion: Gemini in Android Studio now supports multimodal inputs, which lets you attach images directly to your prompts! This unlocks a wealth of new possibilities that improve team collaboration and UI development workflows.

    You can try out this new feature by downloading the latest Android Studio canary. We’ve outlined a few use cases to try, but we’d love to hear what you think as we work through bringing this feature into future stable releases. Check it out:

    https://www.youtube.com/watch?v=f_6mtRWJzuc

    Image attachment – a new dimension of interaction

    We first previewed Gemini’s multimodal capabilities at Google I/O 2024. This technology allows Gemini in Android Studio to understand simple wireframes, and transform them into working Jetpack Compose code.

    You’ll now find an image attachment icon in the Gemini chat window. Simply attach JPEG or PNG files to your prompts and watch Gemini understand and respond to visual information. We’ve observed that images with strong color contrasts yield the best results.

    New “Attach Image File” icon in chat window

    1.1 New “Attach Image File” icon in chat window

    Example of multimodal response in chat

    1.2 Example multimodal response in chat

    We encourage you to experiment with various prompts and images. Here are a few compelling use cases to get you started:

      • Rapid UI prototyping and iteration: Convert a simple wireframe or high-fidelity mock of your app’s UI into working code.
      • Diagram explanation and documentation: Gain deeper insights into complex architecture or data flow diagrams by having Gemini explain their components and relationships.
      • UI troubleshooting: Capture screenshots of UI bugs and ask Gemini for solutions.

    Rapid UI prototyping and iteration

    Gemini’s multimodal support lets you convert visual designs into functional UI code. Simply upload your image and use a clear prompt. It works whether you’re working from your own sketches or from a designer mockup.

    Here’s an example prompt: “For this image provided, write Android Jetpack Compose code to make a screen that’s as close to this image as possible. Make sure to include imports, use Material3, and document the code.” And then you can append any specific or additional instructions related to the image.


    Example of generating Compose code from high-fidelity mock using Gemini in Android Studio

    2. Example of generating Compose code from high-fidelity mock using Gemini in Android Studio (code output)

    For more complex UIs, refine your prompts to capture specific functionality. For instance, when converting a calculator mockup, adding “make the interactions and calculations work as you’d expect” results in a fully functional calculator:

    Example prompt to convert a calculator mock up


    3. Example of generating Compose code from wireframe via Gemini in Android Studio (code output)

    Note: this feature provides an initial design scaffold. It’s a good “first draft,” and you’ll still need to make your own edits and adjustments. Common refinements include ensuring correct drawable imports and importing icons. Consider the generated code a highly efficient starting point that accelerates your UI development workflow.

    Diagram explanation and documentation

    With Gemini’s multimodal capabilities, you can also try uploading an image of your diagram and ask for explanations or documentation.

    Example prompt: Upload the Now in Android architecture diagram and say “Explain the components and data flow in this diagram” or “Write documentation about this diagram”.


    4. Example of asking Gemini to help document the NowInAndroid architecture diagram

    UI troubleshooting

    Leverage Gemini’s visual analysis to identify and resolve bugs quickly. Upload a screenshot of the problematic UI, and Gemini will analyze the image and suggest potential solutions. You can also include relevant code snippets for more precise assistance.

    In the example below, we used Compose UI check and found that the button was stretched too wide on tablet screens, so we took a screenshot and asked Gemini for solutions – it was able to leverage window size classes to provide the right fix.


    5. Example of fixing UI bugs using Image Attachment (code output)
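
    The shape of such a fix can be sketched roughly as follows; this is a minimal sketch (not Gemini’s actual output), assuming the Compose Material 3 adaptive library’s window size class APIs, with SubmitButton as a hypothetical composable.

    import androidx.compose.foundation.layout.fillMaxWidth
    import androidx.compose.foundation.layout.widthIn
    import androidx.compose.material3.Button
    import androidx.compose.material3.Text
    import androidx.compose.material3.adaptive.currentWindowAdaptiveInfo
    import androidx.compose.runtime.Composable
    import androidx.compose.ui.Modifier
    import androidx.compose.ui.unit.dp
    import androidx.window.core.layout.WindowWidthSizeClass

    @Composable
    fun SubmitButton(onClick: () -> Unit) {
        val widthClass = currentWindowAdaptiveInfo().windowSizeClass.windowWidthSizeClass

        // On phones let the button stretch; on expanded widths constrain it
        // so it doesn't span the whole tablet screen.
        val modifier = if (widthClass == WindowWidthSizeClass.EXPANDED) {
            Modifier.widthIn(max = 400.dp)
        } else {
            Modifier.fillMaxWidth()
        }

        Button(onClick = onClick, modifier = modifier) {
            Text("Submit")
        }
    }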

    Download Android Studio today

    Download the latest Android Studio canary today to try the new multimodal features!

    As always, Google is committed to the responsible use of AI. Android Studio won’t send any of your source code to servers without your consent. You can read more on Gemini in Android Studio’s commitment to privacy.

    We appreciate any feedback on things you like or features you would like to see. If you find a bug, please report the issue and also check out known issues. Remember to also follow us on X, Medium, or YouTube for more Android development updates!





    Source link

  • Multimodal for Gemini in Android Studio, news for gaming devs, the latest devices at MWC, XR and more!



    Posted by Anirudh Dewani – Director, Android Developer Relations

    We just dropped our Winter episode of #TheAndroidShow, on YouTube and on developer.android.com, and this time we were in Barcelona to give you the latest from Mobile World Congress and across the Android Developer world. We unveiled a big update to Gemini in Android Studio (multi-modal support, so you can translate image to code) and we shared some news for games developers ahead of GDC later this month. Plus we unpacked the latest Android hardware devices from our partners coming out of Mobile World Congress and recapped all of the latest in Android XR. Let’s dive in!

    https://www.youtube.com/watch?v=-Drt3YeIMuc

    Multimodality image-to-code, now available for Gemini in Android Studio

    At every stage of the development lifecycle, Gemini in Android Studio has become your AI-powered companion. Today, we took the wraps off a new feature: Gemini in Android Studio now supports multimodal image to code, which lets you attach images directly to your prompts! This unlocks a wealth of new possibilities that improve collaboration and design workflows. You can try out this new feature by downloading the latest canary – Android Studio Narwhal, and read more about multimodal image attachment – now available for Gemini in Android Studio.

    https://www.youtube.com/watch?v=f_6mtRWJzuc

    Building excellent games with better graphics and performance

    Ahead of next week’s Games Developer Conference (GDC), we announced new developer tools that will help improve gameplay across the Android ecosystem. We’re making Vulkan the official graphics API on Android, enabling you to build immersive visuals, and we’re enhancing the Android Dynamic Performance Framework (ADPF) to help you deliver longer, more stable gameplay sessions. Learn more about how we’re building excellent games with better graphics and performance.

    https://www.youtube.com/watch?v=SkkkwCEkO6I

    A deep dive into Android XR

    Since we unveiled Android XR in December, it’s been exciting to see developers preparing their apps for the next generation of Android XR devices. In the latest episode of #TheAndroidShow we dove into this new form factor and spoke with a developer who has already been building. Developing for this new platform leverages your existing Android development skills and familiar tools like Android Studio, Kotlin, and Jetpack libraries. The Android XR SDK Developer Preview is available now, complete with an emulator, so you can start experimenting and building XR experiences immediately! Visit developer.android.com/xr for more.

    https://www.youtube.com/watch?v=AkKjMtBYwDA

    New Android foldables and tablets, at Mobile World Congress

    Mobile World Congress is a big moment for Android, with partners from around the world showing off their latest devices. And if you’re already building adaptive apps, we wanted to share some of the cool new foldables and tablets that our partners released in Barcelona:

      • OPPO: OPPO launched the Find N5, their slim 8.93mm foldable with an 8.12” large screen – making it as compact or expansive as needed.
      • Xiaomi: Xiaomi debuted the Xiaomi Pad 7 series. Xiaomi Pad 7 provides a crystal-clear display and, with the productivity accessories, users get a desktop-like experience with the convenience of a tablet.
      • Lenovo: Lenovo showcased their Yoga Tab Plus, the latest powerful tablet from their lineup designed to empower creativity and productivity.

    These new devices are a great reason to build adaptive apps that scale across screen sizes and device types. Plus, Android 16 removes the ability for apps to restrict orientation and resizability at the platform level, so you’ll want to prepare. To help you get started, the Compose Material 3 adaptive library enables you to quickly and easily create layouts across all screen sizes while reducing the overall development cost.
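
    As a starting point, here is a minimal sketch assuming the Material 3 adaptive navigation suite library; NavigationSuiteScaffold switches between a bottom bar, a navigation rail, or a drawer based on the current window size, and AdaptiveApp with its two destinations is hypothetical.

    import androidx.compose.material.icons.Icons
    import androidx.compose.material.icons.filled.Home
    import androidx.compose.material.icons.filled.Settings
    import androidx.compose.material3.Icon
    import androidx.compose.material3.Text
    import androidx.compose.material3.adaptive.navigationsuite.NavigationSuiteScaffold
    import androidx.compose.runtime.Composable
    import androidx.compose.runtime.getValue
    import androidx.compose.runtime.mutableIntStateOf
    import androidx.compose.runtime.remember
    import androidx.compose.runtime.setValue

    @Composable
    fun AdaptiveApp() {
        var selected by remember { mutableIntStateOf(0) }
        val destinations = listOf("Home" to Icons.Filled.Home, "Settings" to Icons.Filled.Settings)

        NavigationSuiteScaffold(
            navigationSuiteItems = {
                destinations.forEachIndexed { index, (label, icon) ->
                    item(
                        selected = selected == index,
                        onClick = { selected = index },
                        icon = { Icon(icon, contentDescription = label) },
                        label = { Text(label) }
                    )
                }
            }
        ) {
            // Main content fills whatever space the chosen navigation component leaves.
            Text("Selected: ${destinations[selected].first}")
        }
    }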

    https://www.youtube.com/watch?v=KqkUQpsQ2QA

    Watch the Winter episode of #TheAndroidShow

    That’s a wrap on this quarter’s episode of #TheAndroidShow. A special thanks to our co-hosts for this episode, Simona Milanović and Alejandra Stamato! You can watch the full show on YouTube and on developer.android.com/events/show.

    Have an idea for our next episode of #TheAndroidShow? It’s your conversation with the broader community, and we’d love to hear your ideas for our next quarterly episode – you can let us know on X or LinkedIn.





    Source link

  • The Third Beta of Android 16



    Posted by Matthew McCullough – VP of Product Management, Android Developer

    Android 16 has officially reached Platform Stability today with Beta 3! That means the API surface is locked, the app-facing behaviors are final, and you can push your Android 16-targeted apps to the Play store right now. Read on for coverage of new security and accessibility features in Beta 3.

    Android delivers enhancements and new features year-round, and your feedback on the Android beta program plays a key role in helping Android continuously improve. The Android 16 developer site has more information about the beta, including how to get it onto devices and the release timeline. We’re looking forward to hearing what you think, and thank you in advance for your continued help in making Android a platform that benefits everyone.

    New in Android 16 Beta 3

    At this late stage in the development cycle, there are only a few new things in the Android 16 Beta 3 release for you to consider when developing your apps.

    Android 16 timeline showing we are on time with Beta releases ending in March

    Broadcast audio support

    Pixel 9 devices on Android 16 Beta now support Auracast broadcast audio with compatible LE Audio hearing aids, part of Android’s work to enhance audio accessibility. Built on the LE Audio standard, Auracast enables compatible hearing aids and earbuds to receive direct audio streams from public venues like airports, concerts, and classrooms. Our Keyword post has more on this technology.

    Outline text for maximum text contrast

    Users with low vision often have reduced contrast sensitivity, making it challenging to distinguish objects from their backgrounds. To help these users, Android 16 Beta 3 introduces outline text, which replaces high contrast text and draws a larger contrasting area around text to greatly improve legibility.

    Android 16 also contains new AccessibilityManager APIs that let your app check whether this mode is enabled, or register a listener for changes. This is primarily for UI toolkits like Compose to offer a similar visual experience. If you maintain a UI toolkit library, or your app performs custom text rendering that bypasses the android.text.Layout class, you can use this to know when outline text is enabled.


    Text with enhanced contrast before and after Android 16’s new outline text accessibility feature

    Test your app with Local Network Protection

    Android 16 Beta 3 adds the ability to test the Local Network Protection (LNP) feature which is planned for a future Android major release. It gives users more control over which apps can access devices on their local network.

    What’s Changing?

    Currently, any app with the INTERNET permission can communicate with devices on the user’s local network. LNP will eventually require apps to request a specific permission to access the local network.

    Beta 3: Opt-In and Test

    In Beta 3, LNP is an opt-in feature. This is your chance to test your app and identify any parts that rely on local network access. Use this adb command to enable LNP restrictions for your app:

    adb shell am compat enable RESTRICT_LOCAL_NETWORK <your_package_name>
    

    After rebooting your device, your app’s local network access is restricted. Test features that might interact with local devices (e.g., device discovery, media casting, connecting to IoT devices). Expect to see socket errors like EPERM or ECONNABORTED if your app tries to access the local network without the necessary permission. See the developer guide for more information, including how to re-enable local network access.
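
    As an illustration of what to look for, the sketch below probes a local address and logs the kind of failure you would expect once the restriction is active; the host and port are placeholders for a device on your test network.

    import java.io.IOException
    import java.net.InetSocketAddress
    import java.net.Socket

    // Attempts a plain TCP connection to a LAN device during testing.
    fun probeLocalDevice(host: String = "192.168.1.50", port: Int = 8080) {
        try {
            Socket().use { socket ->
                socket.connect(InetSocketAddress(host, port), 2_000 /* timeout ms */)
                println("Local network reachable: $host:$port")
            }
        } catch (e: IOException) {
            // With LNP restrictions enabled and no permission granted, the connect
            // attempt surfaces as a socket error (e.g. EPERM or ECONNABORTED).
            println("Local network access failed: ${e.message}")
        }
    }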

    This is a significant change, and we’re committed to working with you to ensure a smooth transition. By testing and providing feedback now, you can help us build a more private and secure Android ecosystem.

    Get your apps, libraries, tools, and game engines ready!

    If you develop an SDK, library, tool, or game engine, it’s even more important to prepare any necessary updates now to prevent your downstream app and game developers from being blocked by compatibility issues and allow them to target the latest SDK features. Please let your developers know if updates are needed to fully support Android 16.

    Testing involves installing your production app or a test app making use of your library or engine using Google Play or other means onto a device or emulator running Android 16 Beta 3. Work through all your app’s flows and look for functional or UI issues. Review the behavior changes to focus your testing. Each release of Android contains platform changes that improve privacy, security, and overall user experience, and these changes can affect your apps. Here are several changes to focus on that apply, even if you don’t yet target Android 16:

      • Broadcasts: Ordered broadcasts using priorities only work within the same process. Use other IPC if you need cross-process ordering.
      • ART: If you use reflection, JNI, or any other means to access Android internals, your app might break. This is never a best practice. Test thoroughly.
      • 16KB Page Size: If your app isn’t 16KB-page-size ready, you can use the new compatibility mode flag, but we recommend migrating to 16KB for best performance.

    Other changes that will be impactful once your app targets Android 16:

    Remember to thoroughly exercise libraries and SDKs that your app is using during your compatibility testing. You may need to update to current SDK versions or reach out to the developer for help if you encounter any issues.

    Once you’ve published the Android 16-compatible version of your app, you can start the process to update your app’s targetSdkVersion. Review the behavior changes that apply when your app targets Android 16 and use the compatibility framework to help quickly detect issues.

    Two Android API releases in 2025

    This preview is for the next major release of Android with a planned launch in Q2 of 2025 and we plan to have another release with new developer APIs in Q4. This Q2 major release will be the only release in 2025 that includes behavior changes that could affect apps. The Q4 minor release will pick up feature updates, optimizations, and bug fixes; like our non-SDK quarterly releases, it will not include any intentional app-breaking behavior changes.

    Android API release timeline 2025

    We’ll continue to have quarterly Android releases. The Q1 and Q3 updates provide incremental updates to ensure continuous quality. We’re putting additional energy into working with our device partners to bring the Q2 release to as many devices as possible.

    There’s no change to the target API level requirements and the associated dates for apps in Google Play; our plans are for one annual requirement each year, tied to the major API level.

    Get started with Android 16

    You can enroll any supported Pixel device to get this and future Android Beta updates over-the-air. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you are currently on Android 16 Beta 2 or are already in the Android Beta program, you will be offered an over-the-air update to Beta 3.

    While the API and behaviors are final, we’re still looking for your feedback so please report issues on the feedback page. The earlier we get your feedback, the better chance we’ll be able to address it in this or a future release.

    For the best development experience with Android 16, we recommend that you use the latest feature drop of Android Studio (Meerkat). Once you’re set up, here are some of the things you should do:

      • Compile against the new SDK, test in CI environments, and report any issues in our tracker on the feedback page.

    We’ll update the beta system images and SDK regularly throughout the Android 16 release cycle. Once you’ve installed a beta build, you’ll automatically get future updates over-the-air for all later previews and Betas.

    For complete information on Android 16 please visit the Android 16 developer site.



    Source link

  • Android Developers Blog: #WeArePlay | How Memory Lane Games helps people with dementia



    Posted by Robbie McLachlan – Developer Marketing

    In our latest #WeArePlay film, which celebrates the people behind apps and games, we meet Bruce – a co-founder of Memory Lane Games. His company turns cherished memories into simple, engaging quizzes for people with different types of dementia. Discover how Memory Lane Games blends nostalgia and technology to spark conversations and emotional connections.

    https://www.youtube.com/watch?v=oBDJH8h7FYs

    What inspired the idea behind Memory Lane Games?

    The idea for Memory Lane Games came about one day at the pub when Peter was telling me how his mum, even with vascular dementia, lights up when she looks at old family photos. It got me thinking about my own mum, who treasures old photos just as much. The idea hit us – why not turn those memories into games? We wanted to help people reconnect with their past and create moments where conversations could flow naturally.

    Memory Lane Games co-founders, Peter and Bruce from Isle of Man

    Can you tell us of a memorable moment in the journey when you realized how powerful the game was?

    We knew we were onto something meaningful when a caregiver in a memory cafe told us about a man who was pretty much non-verbal but would enjoy playing. He started humming along to one of our music trivia games, then suddenly said, “Roy Orbison is a way better singer than Elvis, but Elvis had a better manager.” The caregiver was in tears—it was the first complete sentence he’d spoken in months. Moments like these remind us why we’re doing this—it’s not just about games; it’s about unlocking moments of connection and joy that dementia often takes away.

    A user plays Memory Lane Games from their phone

    One of the key features is having errorless fun with the games, why was that so important?

    We strive for frustration-free design. With our games, there are no wrong answers—just gentle prompts to trigger memories and spark conversations about topics they are interested in. It’s not about winning or losing; it’s about rekindling connections and creating moments of happiness without any pressure or frustration. Dementia can make day-to-day tasks challenging, and the last thing anyone needs is a game that highlights what they might not remember or get right. Caregivers also like being able to redirect attention back to something familiar and fun when behaviour gets more challenging.

    How has Google Play helped your journey?

    What’s been amazing is how Google Play has connected us with an incredibly active and engaged global community without any major marketing efforts on our part.

    For instance, we got our first big traction in places like the Philippines and India—places we hadn’t specifically targeted. Yet here we are, with thousands of downloads in more than 100 countries. That reach wouldn’t have been possible without Google Play.

    A group of senior citizen gather around a table to play a round of Memory Lane Games from a shared mobile device

    What is next for Memory Lane Games?

    We’re really excited about how we can use AI to take Memory Lane Games to the next level. Our goal is to use generative AI, like Google’s Gemini, to create more personalized and localized game content. For example, instead of just focusing on general memories, we want to tailor the game to a specific village the player came from, or a TV show they used to watch, or even local landmarks from their family’s hometown. AI will help us offer games that are deeply personal. Plus, with the power of AI, we can create games in multiple languages, tapping into new regions like Japan, Nigeria or Mexico.

    Discover other inspiring app and game founders featured in #WeArePlay.







    Source link

  • Prioritize media privacy with Android Photo Picker and build user trust



    Posted by Tatiana van Maaren – Global T&S Partnerships Lead, Privacy & Security, and Roxanna Aliabadi Walker – Product Manager

    At Google Play, we’re dedicated to building user trust, especially when it comes to sensitive permissions and your data. We understand that managing files and media permissions can be confusing, and users often worry about which files apps can access. Since these files often contain sensitive information like family photos or financial documents, it’s crucial that users feel in control. That’s why we’re working to provide clearer choices, so users can confidently grant permissions without sacrificing app functionality or their privacy.

    Below are a set of best practices to consider for improving user trust in the sharing of broad access files, ultimately leading to a more successful and sustainable app ecosystem.

    Prioritize user privacy with data minimization

    Building user trust starts with requesting only the permissions essential for your app’s core functions. We understand that photos and videos are sensitive data, and broad access increases security risks. That’s why Google Play now restricts READ_MEDIA_IMAGES and READ_MEDIA_VIDEO permissions, allowing developers to request them only when absolutely necessary, typically for apps like photo/video managers and galleries.

    Leverage privacy-friendly solutions

    Instead of requesting broad storage access, we encourage developers to use the Android Photo Picker, introduced in Android 13. This tool offers a privacy-centric way for users to select specific media files without granting access to their entire library. Android photo picker provides an intuitive interface, including access to cloud-backed photos and videos, and allows for customization to fit your app’s needs. In addition, this system picker is backported to Android 4.4, ensuring a consistent experience for all users. By eliminating runtime permissions, Android photo picker simplifies the user experience and builds trust through transparency.
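
    For reference, launching the system photo picker needs no storage permission at all; here is a minimal sketch using the Activity Result API, where ProfilePhotoActivity and its single-image use case are hypothetical.

    import android.net.Uri
    import androidx.activity.ComponentActivity
    import androidx.activity.result.PickVisualMediaRequest
    import androidx.activity.result.contract.ActivityResultContracts

    class ProfilePhotoActivity : ComponentActivity() {

        // Registers the photo picker contract; the app receives a Uri scoped to
        // the single item the user picked, with no broad media permission.
        private val pickMedia =
            registerForActivityResult(ActivityResultContracts.PickVisualMedia()) { uri: Uri? ->
                if (uri != null) {
                    // Use the selected photo, e.g. as the user's profile picture.
                }
            }

        fun chooseProfilePicture() {
            pickMedia.launch(
                PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly)
            )
        }
    }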

    Build trust through transparent data practices

    We understand that some developers have historically used custom photo pickers for tailored user experiences. However, regardless of whether you use a custom or system picker, transparency with users is crucial. Users want to know why your app needs access to their photos and videos.

    Developers should strive to provide clear and concise explanations within their apps, ideally at the point where the permission is requested. Consider the following best-practice guidelines when crafting your permission request flows:

      • When requesting media access, provide clear explanations within your app. Specifically, tell users which media your app needs (e.g., all photos, profile pictures, sharing videos) and explain the functionality that relies on it (e.g., ‘To choose a profile picture,’ ‘To share videos with friends’).
      • Clearly outline how user data will be used and protected in your privacy policies. Explain whether data is stored locally, transmitted to a server, or shared with third parties. Reassure users that their data will be handled responsibly and securely.

    Learn how Snap has embraced the Android System Picker to prioritize user privacy and streamline their media selection experience. Here’s what they have to say about their implementation:

    A grid of photos in the photo library is shown on a smartphone screen, including a waterfall and two people smiling and posing for the camera. The Google Photos interface is at the top, with the Photos tab selected, and one photo from the grid is selected for use

    “One of our goals is to provide a seamless and intuitive communication experience while ensuring Snapchatters have control over their content. The new flow of the Android Photo Picker is the perfect balance of providing user control of the content they want to share while ensuring fast communication with friends on Snapchat.”

    Marc Brown, Product Manager

    Get started

    Start building a more trustworthy app experience. Explore the Android Photo Picker and implement privacy-first data practices today.

    Acknowledgement

    Special thanks to: May Smith – Product Manager, and Anita Issagholyan – Senior Policy Specialist



    Source link

  • New Android Vitals Metrics are here



    Posted by Karan Jhavar – Product Manager, Android Frameworks, and Dan Brown – Product Manager, Google Play

    Android has long championed performance, continuously evolving to deliver exceptional user experiences. Building upon years of refinement, we’re now focusing on pinpointing resource-intensive use cases and developing platform-level solutions that benefit all users, across the vast Android ecosystem.

    Since the launch of Android vitals in Play Console in 2017, Play has been investing in providing fleet-wide visibility into performance issues, making it easier to identify and fix problems as they occur. Today, Android and Google Play are taking a significant step forward in partnership with top OEMs, like Samsung, leveraging their real-world insights into excessive resource consumption. Our shared goal is to make Android development more streamlined and consistent by providing a standardized definition of what good and great look like when it comes to technical quality.

    “Samsung is excited to collaborate with Android and Google Play on these new performance metrics. By sharing our user experience insights, we aim to help developers build truly optimized apps that deliver exceptional performance and battery life across the ecosystem. We believe this collaboration will lead to a more consistent and positive experience for all Android users.”

    Samsung

    We’re embarking on a multi-year plan to empower you with the tools and data you need to understand, diagnose, and improve your app’s resource consumption, resulting in happier and more engaged users, both for your app, and Android as a whole.

    Today, we’re launching the first of these new metrics in beta: excessive wake locks. This metric directly addresses one of the most significant frustrations for Android users – excessive battery drain. By optimizing your app’s wake lock behavior, you can significantly enhance battery life and user satisfaction.

    The Android vitals beta metric reports partial wake lock use as excessive when all of the partial wake locks, added together, run for more than 3 hours in a 24-hour period. The current iteration of excessive wake lock metrics tracks time only if the wake lock is held when the app is in the background and does not have a foreground service.
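
    To illustrate the kind of wake lock hygiene the metric encourages, here is a minimal sketch: acquire a partial wake lock with a descriptive tag and a timeout, and release it as soon as the background work finishes. The tag and timeout values are placeholders.

    import android.content.Context
    import android.os.PowerManager

    fun doShortBackgroundWork(context: Context, work: () -> Unit) {
        val powerManager = context.getSystemService(Context.POWER_SERVICE) as PowerManager
        val wakeLock = powerManager.newWakeLock(
            PowerManager.PARTIAL_WAKE_LOCK,
            "myapp:short_sync" // descriptive tag shows up in diagnostics
        )
        try {
            // The timeout is a safety net so a bug can't hold the lock for hours.
            wakeLock.acquire(10 * 60 * 1000L /* 10 minutes */)
            work()
        } finally {
            if (wakeLock.isHeld) wakeLock.release()
        }
    }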

    These new metrics will provide comprehensive, fleet-wide visibility into performance and battery life, equipping developers with the data needed to diagnose and resolve performance bottlenecks. We have also revamped our wake lock documentation which shares effective wake lock implementation strategies and best practices.

    In addition, we are also launching the excessive wake lock metric documentation to provide clear guidance on interpreting the metrics. We highly encourage developers to check out this page and provide feedback with their use case on this new metric. Your input is invaluable in refining these metrics before their general availability. In this beta phase, we’re actively seeking feedback on the metric definition and how it aligns with your app’s use cases. Once we reach general availability, we will explore Play Store treatments to help users choose apps that meet their needs.

    Later this year, we may introduce more metrics in Android vitals that highlight other critical performance issues.

    Thank you for your ongoing commitment to delivering delightful, fast, and high-performance experiences to users across the entire Android ecosystem.



    Source link

  • The Fourth Beta of Android 16



    Posted by Matthew McCullough – VP of Product Management, Android Developer

    Today we’re bringing you Android 16 beta 4, the last scheduled update in our Android 16 beta program. Make sure your app or game is ready. It’s also the last chance to give us feedback before Android 16 is released.

    Android 16 Beta 4

    This is our second platform stability release; the developer APIs and all app-facing behaviors are final. Apps targeting Android 16 can be made available in Google Play. Beta 4 includes our latest fixes and optimizations, giving you everything you need to complete your testing. Head over to our Android 16 summary page for a list of the features and behavior changes we’ve been covering in this series of blog posts, or read on for some of the top changes of which you should be aware.

    Android 16 Release timeline showing Platform Stability milestone in April

    Now available on more devices

    The Android 16 Beta is now available on handset, tablet, and foldable form factors from partners including Honor, iQOO, Lenovo, OnePlus, OPPO, Realme, vivo, and Xiaomi. With more Android 16 partners and device types, many more users can run your app on the Android 16 Beta.

    Android 16 Beta Release Partners: Google Pixel, iQOO, Lenovo, OnePlus, Sharp, Oppo, RealMe, vivo, Xiaomi, and Honor

    Get your apps, libraries, tools, and game engines ready!

    If you develop an SDK, library, tool, or game engine, it’s even more important to prepare any necessary updates now to prevent your downstream app and game developers from being blocked by compatibility issues and allow them to target the latest SDK features. Please let your developers know if updates to your SDK are needed to fully support Android 16.

    Testing involves installing your production app or a test app making use of your library or engine using Google Play or other means onto a device or emulator running Android 16 Beta 4. Work through all your app’s flows and look for functional or UI issues. Review the behavior changes to focus your testing. Each release of Android contains platform changes that improve privacy, security, and overall user experience, and these changes can affect your apps. Here are several changes to focus on that apply, even if you aren’t yet targeting Android 16:

      • Broadcasts: Ordered broadcasts using priorities only work within the same process. Use other IPC if you need cross-process ordering.
      • ART: If you use reflection, JNI, or any other means to access Android internals, your app might break. This is never a best practice. Test thoroughly.
      • 16KB Page Size: If your app isn’t 16KB-page-size ready, you can use the new compatibility mode flag, but we recommend migrating to 16KB for best performance.

    Other changes that will be impactful once your app targets Android 16:

    Get your app ready for the future:

      • Local network protection: Consider testing your app with the upcoming Local Network Protection feature. It will give users more control over which apps can access devices on their local network in a future Android major release.

    Remember to thoroughly exercise libraries and SDKs that your app is using during your compatibility testing. You may need to update to current SDK versions or reach out to the developer for help if you encounter any issues.

    Once you’ve published the Android 16-compatible version of your app, you can start the process to update your app’s targetSdkVersion. Review the behavior changes that apply when your app targets Android 16 and use the compatibility framework to help quickly detect issues.

    Two Android API releases in 2025

    This Beta is for the next major release of Android with a planned launch in Q2 of 2025 and we plan to have another release with new developer APIs in Q4. This Q2 major release will be the only release in 2025 that includes behavior changes that could affect apps. The Q4 minor release will pick up feature updates, optimizations, and bug fixes; like our non-SDK quarterly releases, it will not include any intentional app-breaking behavior changes.

    Android 16 2025 SDK release timeline

    We’ll continue to have quarterly Android releases. The Q1 and Q3 updates provide incremental updates to ensure continuous quality. We’re putting additional energy into working with our device partners to bring the Q2 release to as many devices as possible.

    There’s no change to the target API level requirements and the associated dates for apps in Google Play; our plans are for one annual requirement each year, tied to the major API level.

    Get started with Android 16

    You can enroll any supported Pixel device to get this and future Android Beta updates over-the-air. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you are currently on Android 16 Beta 3 or are already in the Android Beta program, you will be offered an over-the-air update to Beta 4.

    While the API and behaviors are final and we are very close to release, we’d still like you to report issues on the feedback page. The earlier we get your feedback, the better chance we’ll be able to address it in this or a future release.

    For the best development experience with Android 16, we recommend that you use the latest Canary build of Android Studio Narwhal. Once you’re set up, here are some of the things you should do:

      • Compile against the new SDK, test in CI environments, and report any issues in our tracker on the feedback page.

    We’ll update the beta system images and SDK regularly throughout the Android 16 release cycle. Once you’ve installed a beta build, you’ll automatically get future updates over-the-air for all later previews and Betas.

    For complete information on Android 16 please visit the Android 16 developer site.



    Source link