Tag: Android

  • Outfit Your Team with Android Tablets for Just $75 Each



    Disclosure: Our goal is to feature products and services that we think you’ll find interesting and useful. If you purchase them, Entrepreneur may get a small share of the revenue from the sale from our commerce partners.

    Equipping a team with modern, mobile tech can be a balancing act—functionality and performance matter, but so does staying within budget. That’s where this deal on the onn. 11″ Tablet Pro really shines. A Walmart store brand, the onn. tablet is just $74.99 (regularly $159), making it an easy decision for business leaders looking to scale their tech resources without scaling costs.

    Despite its budget-friendly price tag, this tablet is built for everyday productivity. It runs on Android 13, offering a familiar interface that syncs smoothly with cloud-based apps, email platforms, messaging tools, and more. It’s great for teams already using Android phones—onboarding is minimal, and the user experience is intuitive.

    The large 11-inch LCD is crisp and vibrant with a 2000 x 1200 resolution, making it ideal for streaming presentations, reviewing reports, or even hosting virtual meetings. Whether you’re using it for point-of-sale systems, training materials, front-desk kiosks, or remote communications, this tablet delivers a sharp, responsive experience.

    Under the hood, the 2.2GHz octa-core processor and 4GB of RAM provide reliable speed for multitasking. Combined with 128GB of internal storage (expandable via microSD), there’s plenty of room for documents, media, and business apps. Plus, dual cameras allow for both video conferencing and on-the-go image capture, which is useful for field teams, social media managers, and sales staff.

    Battery life is often a pain point with mobile devices, but this one lasts up to 16 hours, giving your team an all-day companion that won’t die mid-task. Whether it’s used in the office or on the road, charging anxiety becomes a thing of the past.

    And since this is an open-box unit, you’re getting a like-new device at nearly half the price. Each tablet is thoroughly tested and verified. Although the box may exhibit minor signs of handling, the hardware inside remains in new condition.

    Get this onn. 11″ Tablet Pro for just $74.99 (regularly $159) while it’s still available.

    StackSocial prices subject to change.





  • Android Developers Blog: The Android Show: I/O Edition



    Posted by Matthew McCullough – Vice President, Product Management, Android Developer

    We just dropped an I/O Edition of The Android Show, where we unpacked exciting new experiences coming to the Android ecosystem: a fresh and dynamic look and feel, smarts across your devices, and enhanced safety and security features. Join Sameer Samat, President of Android Ecosystem, and the Android team to learn about these exciting new developments in the episode below, and read about all of the updates for users.

    Tune into Google I/O next week – including the Developer Keynote as well as the full Android track of sessions – where we’ll cover these topics in more detail and show you how to get started.

    https://www.youtube.com/watch?v=l3yDd3CmA_Y

    Start building with Material 3 Expressive

    The world of UX design is constantly evolving, and you deserve the tools to create truly engaging and impactful experiences. That’s why Material Design’s latest evolution, Material 3 Expressive, provides new ways to make your product more engaging, easy to use, and desirable. Learn more and try out the new Material 3 Expressive: an expansion pack designed to enhance your app’s appeal by harnessing emotional UX. It comes with new components, a motion-physics system, type styles, colors, shapes, and more.

    Material 3 Expressive will be coming to Android 16 later this year; check out the Google I/O talk next week where we’ll dive into this in more detail.

    A fluid design built for your watch’s round display

    Wear OS 6, arriving later this year, brings Material 3 Expressive design to Google’s smartwatch platform. The new design language puts the round watch display at the heart of the experience, and it is embraced in every component and motion of the system, from buttons to notifications. You’ll be able to try the new visual design and upgrade existing app experiences to a new level. Next week, tune in to the What’s New in Android session to learn more.

    Plus some goodies in Android 16…

    We also unpacked some of the latest features coming to users in Android 16, which we’ve been previewing with you for the last few months. If you haven’t already, you can try out the latest Beta of Android 16.

    A few new features in Android 16 that developers should pay attention to include Live Updates, professional media and camera features, desktop windowing for tablets, and major accessibility enhancements:

      • Live Updates allow your app to show time-sensitive progress updates. Use the new ProgressStyle template for an improved experience around navigation, deliveries, and rideshares.
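    As a rough illustration, here is a hedged sketch of what a Live Update built on the ProgressStyle template could look like. The API shape follows the Android 16 preview documentation and may still change; the channel ID, icon, strings, and segment values are placeholders, not code from the release.

    import android.app.Notification
    import android.content.Context
    import android.graphics.Color

    fun buildRideProgressNotification(context: Context): Notification {
        // Two trip segments, colored independently; lengths are relative units
        val progressStyle = Notification.ProgressStyle()
            .setProgressSegments(
                listOf(
                    Notification.ProgressStyle.Segment(60).setColor(Color.GREEN),
                    Notification.ProgressStyle.Segment(40).setColor(Color.LTGRAY),
                )
            )
            .setProgress(45) // current position along the combined segments

        return Notification.Builder(context, "ride_updates" /* placeholder channel */)
            .setSmallIcon(android.R.drawable.stat_sys_warning) // placeholder icon
            .setContentTitle("Driver is 5 min away")
            .setStyle(progressStyle)
            .build()
    }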

    Watch the What’s New in Android session and the Live updates talk to learn more.

    Tune in next week to Google I/O

    This was just a preview of some Android-related news, so remember to tune in next week to Google I/O, where we’ll be diving into a range of Android developer topics in a lot more detail. You can check out What’s New in Android and the full Android track of sessions to start planning your time.

    We can’t wait to see you next week, whether you’re joining in person or virtually from anywhere around the world!




  • Foundational Tools in Android | Kodeco







  • Building delightful Android camera and media experiences



    Posted by Donovan McMurray, Mayuri Khinvasara Khabya, Mozart Louis, and Nevin Mital – Developer Relations Engineers

    Hello Android Developers!

    We are the Android Developer Relations Camera & Media team, and we’re excited to bring you something a little different today. Over the past several months, we’ve been hard at work writing sample code and building demos that showcase how to take advantage of all the great potential Android offers for building delightful user experiences.

    Some of these efforts are available for you to explore now, and some you’ll see later throughout the year, but for this blog post we thought we’d share some of the learnings we gathered while going through this exercise.

    Grab your favorite Android plush or rubber duck, and read on to see what we’ve been up to!

    Future-proof your app with Jetpack

    Nevin Mital

    One of our focuses for the past several years has been improving the developer tools available for video editing on Android. This led to the creation of the Jetpack Media3 Transformer APIs, which offer solutions for both single-asset and multi-asset video editing preview and export. Today, I’d like to focus on the Composition demo app, a sample app that showcases some of the multi-asset editing experiences that Transformer enables.

    I started by adding a custom video compositor to demonstrate how you can arrange input video sequences into different layouts for your final composition, such as a 2×2 grid or a picture-in-picture overlay. You can customize this by implementing a VideoCompositorSettings and overriding the getOverlaySettings method. This object can then be set when building your Composition with setVideoCompositorSettings.

    Here is an example for the 2×2 grid layout:

    object : VideoCompositorSettings {
      ...
    
      override fun getOverlaySettings(inputId: Int, presentationTimeUs: Long): OverlaySettings {
        return when (inputId) {
          0 -> { // First sequence is placed in the top left
            StaticOverlaySettings.Builder()
              .setScale(0.5f, 0.5f)
              .setOverlayFrameAnchor(0f, 0f) // Middle of overlay
              .setBackgroundFrameAnchor(-0.5f, 0.5f) // Top-left section of background
              .build()
          }
    
          1 -> { // Second sequence is placed in the top right
            StaticOverlaySettings.Builder()
              .setScale(0.5f, 0.5f)
              .setOverlayFrameAnchor(0f, 0f) // Middle of overlay
              .setBackgroundFrameAnchor(0.5f, 0.5f) // Top-right section of background
              .build()
          }
    
          2 -> { // Third sequence is placed in the bottom left
            StaticOverlaySettings.Builder()
              .setScale(0.5f, 0.5f)
              .setOverlayFrameAnchor(0f, 0f) // Middle of overlay
              .setBackgroundFrameAnchor(-0.5f, -0.5f) // Bottom-left section of background
              .build()
          }
    
          3 -> { // Fourth sequence is placed in the bottom right
            StaticOverlaySettings.Builder()
              .setScale(0.5f, 0.5f)
              .setOverlayFrameAnchor(0f, 0f) // Middle of overlay
              .setBackgroundFrameAnchor(0.5f, -0.5f) // Bottom-right section of background
              .build()
          }
    
          else -> {
            StaticOverlaySettings.Builder().build()
          }
        }
      }
    }
    

    Since getOverlaySettings also provides a presentation time, we can even animate the layout, such as in this picture-in-picture example:

    moving image of picture in picture on a mobile device
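    For instance, a minimal sketch of such a time-based layout, reusing the StaticOverlaySettings API from the grid example above (the five-second duration and anchor values are illustrative assumptions, not the demo app’s exact code):

    override fun getOverlaySettings(inputId: Int, presentationTimeUs: Long): OverlaySettings {
      // Slide the picture-in-picture overlay from left to right over five seconds
      val progress = (presentationTimeUs / 5_000_000f).coerceIn(0f, 1f)
      return StaticOverlaySettings.Builder()
        .setScale(0.35f, 0.35f)
        .setOverlayFrameAnchor(0f, 0f) // Middle of overlay
        .setBackgroundFrameAnchor(-0.6f + 1.2f * progress, -0.6f) // Sweep across the bottom
        .build()
    }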

    Next, I spent some time migrating the Composition demo app to use Jetpack Compose. With complicated editing flows, it can help to take advantage of as much screen space as is available, so I decided to use the supporting pane adaptive layout. This way, the user can fine-tune their video creation on the preview screen, and export options are shown alongside it only when a larger display has room for them. Below, you can see how the UI dynamically adapts to the screen size on a foldable device, when switching from the outer screen to the inner screen and vice versa.


    moving image of supporting pane adaptive layout
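    A minimal sketch of that supporting pane setup, assuming the androidx.compose.material3.adaptive libraries (1.0-era API shape; PreviewPane and ExportOptionsPane are hypothetical composables standing in for the demo’s screens):

    import androidx.compose.material3.adaptive.ExperimentalMaterial3AdaptiveApi
    import androidx.compose.material3.adaptive.layout.AnimatedPane
    import androidx.compose.material3.adaptive.layout.SupportingPaneScaffold
    import androidx.compose.material3.adaptive.navigation.rememberSupportingPaneScaffoldNavigator
    import androidx.compose.runtime.Composable

    @OptIn(ExperimentalMaterial3AdaptiveApi::class)
    @Composable
    fun CompositionEditorScreen() {
      val navigator = rememberSupportingPaneScaffoldNavigator()
      SupportingPaneScaffold(
        directive = navigator.scaffoldDirective,
        value = navigator.scaffoldValue,
        mainPane = { AnimatedPane { PreviewPane() } },             // video preview and fine-tuning
        supportingPane = { AnimatedPane { ExportOptionsPane() } }, // shown alongside on large displays
      )
    }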

    What’s great is that by using Jetpack Media3 and Jetpack Compose, these features also carry over seamlessly to other devices and form factors, such as the new Android XR platform. Right out-of-the-box, I was able to run the demo app in Home Space with the 2D UI I already had. And with some small updates, I was even able to adapt the UI specifically for XR with features such as multiple panels, and to take further advantage of the extra space, an Orbiter with playback controls for the editing preview.

    moving image of sequential composition preview in Android XR

    Orbiter(
      position = OrbiterEdge.Bottom,
      offset = EdgeOffset.inner(offset = MaterialTheme.spacing.standard),
      alignment = Alignment.CenterHorizontally,
      shape = SpatialRoundedCornerShape(CornerSize(28.dp))
    ) {
      Row(horizontalArrangement = Arrangement.spacedBy(MaterialTheme.spacing.mini)) {
        // Playback control for rewinding by 10 seconds
        FilledTonalIconButton({ viewModel.seekBack(10_000L) }) {
          Icon(
            painter = painterResource(id = R.drawable.rewind_10),
            contentDescription = "Rewind by 10 seconds"
          )
        }
        // Playback control for play/pause
        FilledTonalIconButton({ viewModel.togglePlay() }) {
          Icon(
            painter = painterResource(id = R.drawable.rounded_play_pause_24),
            contentDescription =
                if (viewModel.compositionPlayer.isPlaying) {
                    "Pause preview playback"
                } else {
                    "Resume preview playback"
                }
          )
        }
        // Playback control for forwarding by 10 seconds
        FilledTonalIconButton({ viewModel.seekForward(10_000L) }) {
          Icon(
            painter = painterResource(id = R.drawable.forward_10),
            contentDescription = "Forward by 10 seconds"
          )
        }
      }
    }
    

    Jetpack libraries unlock premium functionality incrementally

    Donovan McMurray

    Not only do our Jetpack libraries have you covered by working consistently across existing and future devices, but they also open the doors to advanced functionality and custom behaviors to support all types of app experiences. In a nutshell, our Jetpack libraries aim to make the common case very accessible and easy, and they provide hooks for adding more custom features later.

    We’ve worked with many app teams who have switched to a Jetpack library, built the basics, added their critical custom features, and actually saved developer time over their estimates. Let’s take a look at CameraX and how this incremental development can supercharge your process.

    // Set up CameraX app with preview and image capture.
    // Note: setting the resolution selector is optional, and if not set,
    // then a default 4:3 ratio will be used.
    val aspectRatioStrategy = AspectRatioStrategy(
      AspectRatio.RATIO_16_9, AspectRatioStrategy.FALLBACK_RULE_NONE)
    val resolutionSelector = ResolutionSelector.Builder()
      .setAspectRatioStrategy(aspectRatioStrategy)
      .build()
    
    private val previewUseCase = Preview.Builder()
      .setResolutionSelector(resolutionSelector)
      .build()
    private val imageCaptureUseCase = ImageCapture.Builder()
      .setResolutionSelector(resolutionSelector)
      .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
      .build()
    
    val useCaseGroupBuilder = UseCaseGroup.Builder()
      .addUseCase(previewUseCase)
      .addUseCase(imageCaptureUseCase)
    
    cameraProvider.unbindAll()
    
    camera = cameraProvider.bindToLifecycle(
      this,  // lifecycleOwner
      CameraSelector.DEFAULT_BACK_CAMERA,
      useCaseGroupBuilder.build(),
    )
    

    After setting up the basic structure for CameraX, you can set up a simple UI with a camera preview and a shutter button. You can use the CameraX Viewfinder composable, which displays a Preview stream from a CameraX SurfaceRequest.

    // Create preview
    Box(
      Modifier
        .background(Color.Black)
        .fillMaxSize(),
      contentAlignment = Alignment.Center,
    ) {
      surfaceRequest?.let { request ->
        CameraXViewfinder(
          modifier = Modifier.fillMaxSize(),
          implementationMode = ImplementationMode.EXTERNAL,
          surfaceRequest = request, // use the non-null value captured by let
        )
      }
      Button(
        onClick = onPhotoCapture,
        shape = CircleShape,
        colors = ButtonDefaults.buttonColors(containerColor = Color.White),
        modifier = Modifier
          .height(75.dp)
          .width(75.dp),
      ) {} // Button requires a content lambda; the shutter is just a white circle
    }
    
    fun onPhotoCapture() {
      // Not shown: defining the ImageCapture.OutputFileOptions for
      // your saved images
      imageCaptureUseCase.takePicture(
        outputOptions,
        ContextCompat.getMainExecutor(context),
        object : ImageCapture.OnImageSavedCallback {
          override fun onError(exc: ImageCaptureException) {
            val msg = "Photo capture failed."
            Toast.makeText(context, msg, Toast.LENGTH_SHORT).show()
          }
    
          override fun onImageSaved(output: ImageCapture.OutputFileResults) {
            val savedUri = output.savedUri
            if (savedUri != null) {
              // Do something with the savedUri if needed
            } else {
              val msg = "Photo capture failed."
              Toast.makeText(context, msg, Toast.LENGTH_SHORT).show()
            }
          }
        },
      )
    }
    

    You’re already on track for a solid camera experience, but what if you wanted to add some extra features for your users? Adding filters and effects is easy with CameraX’s Media3 effect integration, one of the new features introduced in CameraX 1.4.0.

    Here’s how simple it is to add a black and white filter from Media3’s built-in effects.

    val media3Effect = Media3Effect(
      application,
      PREVIEW or IMAGE_CAPTURE,
      ContextCompat.getMainExecutor(application),
      {},
    )
    media3Effect.setEffects(listOf(RgbFilter.createGrayscaleFilter()))
    useCaseGroupBuilder.addEffect(media3Effect)
    

    The Media3Effect object takes a Context, a bitwise representation of the use case constants for targeted UseCases, an Executor, and an error listener. Then you set the list of effects you want to apply. Finally, you add the effect to the useCaseGroupBuilder we defined earlier.

    moving image of the camera app before and after applying the grayscale filter

    (Left) Our camera app with no filter applied. 
     (Right) Our camera app after the createGrayscaleFilter was added.

    There are many other built-in effects you can add, too! See the Media3 Effect documentation for more options, like brightness, color lookup tables (LUTs), contrast, blur, and many other effects.

    To take your effects to yet another level, it’s also possible to define your own effects by implementing the GlEffect interface, which acts as a factory of GlShaderPrograms. You can implement a BaseGlShaderProgram’s drawFrame() method to implement a custom effect of your own. A minimal implementation should tell your graphics library to use its shader program, bind the shader program’s vertex attributes and uniforms, and issue a drawing command.
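    As a rough sketch, a custom tint effect could look like the following. This assumes Media3’s GlEffect, BaseGlShaderProgram, GlProgram, and GlUtil APIs (androidx.media3.effect and androidx.media3.common.util); the shader sources and tint math are illustrative, not code from the sample.

    import android.content.Context
    import android.opengl.GLES20
    import android.util.Size
    import androidx.media3.common.util.GlProgram
    import androidx.media3.common.util.GlUtil
    import androidx.media3.effect.BaseGlShaderProgram
    import androidx.media3.effect.GlEffect
    import androidx.media3.effect.GlShaderProgram

    private const val VERTEX_SHADER = """
      attribute vec4 aFramePosition;
      varying vec2 vTexCoords;
      void main() {
        gl_Position = aFramePosition;
        vTexCoords = aFramePosition.xy * 0.5 + 0.5;
      }
    """

    private const val FRAGMENT_SHADER = """
      precision mediump float;
      uniform sampler2D uTexSampler;
      varying vec2 vTexCoords;
      void main() {
        vec4 c = texture2D(uTexSampler, vTexCoords);
        gl_FragColor = vec4(c.r, c.g * 0.8, c.b * 0.8, c.a); // warm tint
      }
    """

    class TintEffect : GlEffect {
      override fun toGlShaderProgram(context: Context, useHdr: Boolean): GlShaderProgram =
        TintShaderProgram()
    }

    private class TintShaderProgram :
      BaseGlShaderProgram(/* useHighPrecisionColorProcessing= */ false, /* texturePoolCapacity= */ 1) {

      private val glProgram = GlProgram(VERTEX_SHADER, FRAGMENT_SHADER).apply {
        // A full-frame quad for the vertex shader; 4 floats per vertex (homogeneous coordinates)
        setBufferAttribute("aFramePosition", GlUtil.getNormalizedCoordinateBounds(), 4)
      }

      override fun configure(inputWidth: Int, inputHeight: Int): Size = Size(inputWidth, inputHeight)

      override fun drawFrame(inputTexId: Int, presentationTimeUs: Long) {
        glProgram.use()                                   // tell GL to use our shader program
        glProgram.setSamplerTexIdUniform("uTexSampler", inputTexId, /* texUnitIndex= */ 0)
        glProgram.bindAttributesAndUniforms()             // bind vertex attributes and uniforms
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, /* first= */ 0, /* count= */ 4)
      }
    }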

    Jetpack libraries meet you where you are and your app’s needs. Whether that be a simple, fast-to-implement, and reliable implementation, or custom functionality that helps the critical user journeys in your app stand out from the rest, Jetpack has you covered!

    Jetpack offers a foundation for innovative AI Features

    Mayuri Khinvasara Khabya

    Just as Donovan demonstrated with CameraX for capture, Jetpack Media3 provides a reliable, customizable, and feature-rich solution for playback with ExoPlayer. The AI Samples app builds on this foundation to delight users with helpful and enriching AI-driven additions.

    In today’s rapidly evolving digital landscape, users expect more from their media applications. Simply playing videos is no longer enough. Developers are constantly seeking ways to enhance user experiences and provide deeper engagement. Leveraging the power of Artificial Intelligence (AI), particularly when built upon robust media frameworks like Media3, offers exciting opportunities. Let’s take a look at some of the ways we can transform the way users interact with video content:

      • Empowering Video Understanding: The core idea is to use AI, specifically multimodal models like the Gemini Flash and Pro models, to analyze video content and extract meaningful information. This goes beyond simply playing a video; it’s about understanding what’s in the video and making that information readily accessible to the user.
      • Actionable Insights: The goal is to transform raw video into summaries, insights, and interactive experiences. This allows users to quickly grasp the content of a video and find specific information they need or learn something new!
      • Accessibility and Engagement: AI helps make videos more accessible by providing features like summaries, translations, and descriptions. It also aims to increase user engagement through interactive features.

    A Glimpse into AI-Powered Video Journeys

    The following example demonstrates potential video journeys enhanced by artificial intelligence. This sample integrates several components, such as ExoPlayer and Transformer from Media3; the Firebase SDK (leveraging Vertex AI on Android); and Jetpack Compose, ViewModel, and StateFlow. The code will be available soon on GitHub.

    moving images of examples of AI-powered video journeys

    (Left) Video summarization
     (Right) Thumbnail timestamps and HDR frame extraction

    There are two experiences in particular that I’d like to highlight:

      • HDR Thumbnails: AI can help identify key moments in the video that could make for good thumbnails. With those timestamps, you can use the new ExperimentalFrameExtractor API from Media3 to extract HDR thumbnails from videos, providing richer visual previews.
      • Text-to-Speech: AI can be used to convert textual information derived from the video into spoken audio, enhancing accessibility. On Android, you can also choose to play audio in different languages and dialects, enhancing personalization for a wider audience (a minimal sketch follows this list).
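    As a quick illustration of the Text-to-Speech idea, here is a minimal sketch using the platform’s TextToSpeech API. The class name, summary string, and Locale are placeholders; this is not the AI Samples app’s code.

    import android.content.Context
    import android.speech.tts.TextToSpeech
    import java.util.Locale

    class SummarySpeaker(context: Context) {
        private var ready = false
        private val tts = TextToSpeech(context) { status ->
            ready = (status == TextToSpeech.SUCCESS)
        }

        fun speak(summary: String, locale: Locale = Locale.US) {
            if (!ready) return
            tts.setLanguage(locale) // choose the language/dialect for your audience
            tts.speak(summary, TextToSpeech.QUEUE_FLUSH, null, "summary-utterance")
        }

        fun shutdown() = tts.shutdown()
    }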

    Using the right AI solution

    Currently, only cloud models support video inputs, so we went ahead with a cloud-based solution. Integrating Firebase in our sample empowers the app to:

      • Generate real-time, concise video summaries automatically.
      • Produce comprehensive content metadata, including chapter markers and relevant hashtags.
      • Facilitate seamless multilingual content translation.

    So how do you actually interact with a video and work with Gemini to process it? First, send your video as an input parameter to your prompt:

    val promptData =
        "Summarize this video in the form of top 3-4 takeaways only. Write in the form of bullet points. Don't assume if you don't know"

    val generativeModel = Firebase.vertexAI.generativeModel("gemini-2.0-flash")
    _outputText.value = OutputTextState.Loading

    viewModelScope.launch(Dispatchers.IO) {
        try {
            val requestContent = content {
                fileData(videoSource.toString(), "video/mp4")
                text(promptData) // the prompt defined above
            }
            val outputStringBuilder = StringBuilder()

            // Stream the response, updating the UI as chunks arrive
            generativeModel.generateContentStream(requestContent).collect { response ->
                outputStringBuilder.append(response.text)
                _outputText.value = OutputTextState.Success(outputStringBuilder.toString())
            }
        } catch (error: Exception) {
            _outputText.value = error.localizedMessage?.let { OutputTextState.Error(it) }
        }
    }
    

    Notice there are two key components here:

      • FileData: This component integrates a video into the query.
      • Prompt: This asks the user what specific assistance they need from AI in relation to the provided video.

    Of course, you can fine-tune your prompt to fit your requirements and get responses accordingly.

    In conclusion, by harnessing the capabilities of Jetpack Media3 and integrating AI solutions like Gemini through Firebase, you can significantly elevate video experiences on Android. This combination enables advanced features like video summaries, enriched metadata, and seamless multilingual translations, ultimately enhancing accessibility and engagement for users. As these technologies continue to evolve, the potential for creating even more dynamic and intelligent video applications is vast.

    Go above and beyond with specialized APIs

    Mozart Louis

    Android 16 introduces the new audio PCM Offload mode, which can reduce the power consumption of audio playback in your app, leading to longer playback time and increased user engagement. Eliminating power anxiety greatly enhances the user experience.

    Oboe is Android’s premier audio API, which developers can use to create high-performance, low-latency audio apps. A new feature called Native PCM Offload playback is being added to the Android NDK in Android 16.

    Offload playback helps save battery life when playing audio. It works by sending a large chunk of audio to a special part of the device’s hardware (a DSP). This allows the CPU of the device to go into a low-power state while the DSP handles playing the sound. This works with uncompressed audio (like PCM) and compressed audio (like MP3 or AAC), where the DSP also takes care of decoding.

    This can result in significant power saving while playing back audio and is perfect for applications that play audio in the background or while the screen is off (think audiobooks, podcasts, music etc).

    We created the sample app PowerPlay to demonstrate how to implement these features using the latest NDK version, C++ and Jetpack Compose.

    Here are the most important parts!

    First order of business is to ensure the device supports audio offload for the file attributes you need. In the example below, we check whether the device supports audio offload of a stereo, float PCM stream with a sample rate of 48,000 Hz.

    val format = AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_FLOAT)
        .setSampleRate(48000)
        .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
        .build()

    val attributes = AudioAttributes.Builder()
        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .build()

    val isOffloadSupported =
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
            AudioManager.isOffloadedPlaybackSupported(format, attributes)
        } else {
            false
        }

    if (isOffloadSupported) {
        player.initializeAudio(PerformanceMode::POWER_SAVING_OFFLOADED)
    }
    

    Once we know the device supports audio offload, we can confidently set the Oboe audio streams’ performance mode to the new performance mode option, PerformanceMode::POWER_SAVING_OFFLOADED.

    // Create an audio stream
    AudioStreamBuilder builder;
    builder.setChannelCount(mChannelCount);
    builder.setDataCallback(mDataCallback);
    builder.setFormat(AudioFormat::Float);
    builder.setSampleRate(48000);

    builder.setErrorCallback(mErrorCallback);
    builder.setPresentationCallback(mPresentationCallback);
    builder.setPerformanceMode(PerformanceMode::POWER_SAVING_OFFLOADED);
    builder.setFramesPerDataCallback(128);
    builder.setSharingMode(SharingMode::Exclusive);
    builder.setSampleRateConversionQuality(SampleRateConversionQuality::Medium);
    Result result = builder.openStream(mAudioStream);
    

    Now, when audio is played back, it will be offloaded to the DSP, saving power during playback.

    There is more to this feature, which we’ll cover in a future blog post fully detailing all of the newly available APIs that will help you optimize your audio playback experience!

    What’s next

    Of course, we were only able to share the tip of the iceberg with you here, so to dive deeper into the samples, check out the following links:

    Hopefully these examples have inspired you to explore what new and fascinating experiences you can build on Android. Tune in to our session at Google I/O in a couple weeks to learn even more about use-cases supported by solutions like Jetpack CameraX and Jetpack Media3!




  • Android Studio Meerkat Feature Drop is stable



    Posted by Adarsh Fernando, Group Product Manager

    Today, we’re excited to announce the stable release of Android Studio Meerkat Feature Drop (2024.3.2)!

    This release brings a host of new features and improvements designed to boost your productivity and enhance your development workflow. With numerous enhancements, this latest release helps you build high-quality Android apps faster and more efficiently: streamlined Jetpack Compose previews, new Gemini capabilities, better Kotlin Multiplatform (KMP) integration, improved device management, and more.

    Read on to learn about the key updates in Android Studio Meerkat Feature Drop, and download the latest stable version today to explore them yourself!

    Developer Productivity Enhancements

    Analyze Crash Reports with Gemini in Android Studio

    Debugging production crashes can require you to spend significant time switching contexts between your crash reporting tool, such as Firebase Crashlytics or Android Vitals, and investigating root causes in the IDE. Now, when viewing reports in App Quality Insights (AQI), click the Insights tab. Gemini provides a summary of the crash, generates insights, and links to useful documentation. If you also provide Gemini with access to local code context, it can provide more accurate results, relevant next steps, and code suggestions. This helps you reduce the time spent diagnosing and resolving issues.

    moving image of Gemini in the App Quality Insights tool window in Android Studio

    Gemini helps you investigate, understand, and resolve crashes in your app much more quickly in the App Quality Insights tool window.

    Generate Unit Test Scenarios with Gemini

    Writing effective unit tests is crucial but can be time-consuming. Gemini now helps kickstart this process by generating relevant test scenarios. Right-click on a class in your editor and select Gemini > Generate Unit Test Scenarios. Gemini analyzes the code and suggests test cases with descriptive names, outlining what to test. While you still implement the specific test logic, this significantly speeds up the initial setup and ensures better test coverage by suggesting scenarios you might have missed.

    moving image of generating unit test scenarios in Android Studio

    Gemini helps you generate unit test scenarios for your app.

    Gemini Prompt Library

    No more retyping your most frequently used prompts for Gemini! The new Prompt Library lets you save prompts directly within Android Studio (Settings > Gemini > Prompt Library). Whether it’s a specific code generation pattern, a refactoring instruction, or a debugging query you use often, save it once from the chat (right-click > Save prompt) and re-apply it instantly from the editor (right-click > Gemini > Prompt Library). Prompts that you save can also be shared and standardized across your team.

    moving image of prompt library in Android Studio

    The prompt library saves your frequently used Gemini prompts to make them easier to use.

    You have the option to store prompts at the IDE level or the Project level:

      • IDE-level prompts are private and can be used across multiple projects.
      • Project-level prompts can be shared across teams working on the same project (if the .idea folder is added to VCS).

    Compose and UI Development

    Themed Icon Support Preview

    Ensure your app’s branding looks great with Android’s themed icons. Android Studio now lets you preview how your existing launcher icon adapts to the monochromatic theming algorithm directly within the IDE. This quick visual check helps you identify potential contrast issues or undesirable shapes early in the workflow, even before you provide a dedicated monochromatic drawable. This allows for faster iteration on your app’s visual identity.

    moving image of themed icon support in preview in Android Studio

    Themed icon support in Preview helps you visually check how your existing launcher icon adapts to monochromatic theming.

    Compose Preview Enhancements

    Iterating on your Compose UI is now faster and better organized:

      • Enhanced Zoom: Navigate complex layouts more easily with smoother, more responsive zooming in your Compose previews.
      • Collapsible Groups: Tidy up your preview surface by collapsing groups of related composables under their @Preview annotation names, letting you focus on specific parts of the UI without clutter.
      • Grid Mode by Default: Grid mode is now the default for a clear overview. Gallery mode (for flipping through individual previews) is available via right-click, while List view has been removed to streamline the experience.

    moving image of Compose previews in Android Studio

    Compose previews render more smoothly and make it easier to hide previews you’re not focused on.

    Build and Deploy

    KMP Shared Module Integration

    Android Studio now streamlines adding shared logic to your Android app with the new Kotlin Multiplatform Shared Module template. This provides a dedicated starting point within your Android project, making it easier to structure and build shared business logic for both Android and iOS directly from Android Studio.

    Kotlin Multiplatform template in Android Studio

    The new Kotlin Multiplatform module template makes it easier to add shared business logic to your existing app.

    Updated UX for Adding Devices

    Spend less time configuring test devices. The new Device Manager UX for adding virtual and remote devices makes it much easier to configure the devices you want from the Device Manager. To get started, click the ‘+’ action at the top of the window and select one of these options:

      • Create Virtual Device: New filters, recommendations, and creation flow guide you towards creating AVDs that are best suited for your intended purpose and your machine’s performance.
      • Add Remote Devices: With Android Device Streaming, powered by Firebase, you can connect and debug your app with a variety of real physical devices. With a new catalog view and filters, it’s now easier to locate and start using the device you need in just a few clicks.

    moving image of configuring virtual devices in Android Studio

    It’s now easier to configure virtual devices that are optimized for your workstation.

    Google Play Deprecated SDK Warnings

    Stay more informed about SDKs you publish with your app. Android Studio now displays warnings from the Google Play SDK Index when an SDK used in your app has been deprecated by its author. These warnings include information about suggested alternative SDKs, helping you proactively manage dependencies and avoid potential issues related to outdated or insecure libraries.

    Google Play Deprecated SDK warnings in Android Studio

    Play deprecated SDK warnings help you avoid potential issues related to outdated or insecure libraries.

    Updated Build Menu and Actions

    We’ve refined the Build menu for a more intuitive experience:

      • New ‘Build run-configuration-name’ Action: Builds the currently selected run configuration (e.g., :app or a specific test). This is now the default action for the toolbar button and Control/Command+F9.
      • Reordered Actions: The new build action is prioritized at the top, followed by Compile and Assemble actions.
      • Clearer Naming: “Rebuild Project” is now “Clean and Assemble Project with Tests”. “Make Project” is renamed to “Assemble Project”, and a new “Assemble Project with Tests” action is available.

    Build menu in Android Studio

    The Build menu includes behavior and naming changes to simplify and streamline the experience.

    Standardized Config Directories

    Switching between Stable, Beta, and Canary versions of Android Studio is now smoother. Configuration directories are standardized, removing the “Preview” suffix for non-stable builds. We’ve also added the micro version (e.g., AndroidStudio2024.3.2) to the path, allowing different feature drops to run side-by-side without conflicts. This simplifies managing your IDE settings, especially if you work with multiple Android Studio installations.

    IntelliJ platform update

    Android Studio Meerkat Feature Drop (2024.3.2) includes the IntelliJ 2024.3 platform release, which has many new features such as a feature-complete K2 mode, more reliable Java** and Kotlin code inspections, grammar checks during indexing, debugger improvements, speed and quality-of-life improvements to the Terminal, and more.

    For more information, read the full IntelliJ 2024.3 release notes.

    Summary

    Android Studio Meerkat Feature Drop (2024.3.2) delivers these key features and enhancements:

      • Developer Productivity:
          • Analyze Crash Reports with Gemini
          • Generate Unit Test Scenarios with Gemini
          • Gemini Prompt Library
      • Compose and UI:
          • Themed Icon Preview
          • Compose Preview Enhancements (Zoom, Collapsible Groups, View Modes)
      • Build and Deploy:
          • KMP Shared Module Template
          • Updated UX for Adding Devices
          • Google Play SDK Insights: Deprecated SDK Warnings
          • Updated Build Menu & Actions
          • Standardized Config Directories
      • IntelliJ Platform Update
          • Feature-complete K2 mode
          • Improved Kotlin and Java** inspection reliability
          • Debugger improvements
          • Speed and quality of life improvements in Terminal

    Getting Started

    Ready to elevate your Android development? Download Android Studio Meerkat Feature Drop and start using these powerful new features today!

    As always, your feedback is crucial. Check known issues, report bugs, suggest improvements, and connect with the community on LinkedIn, Medium, YouTube, or X. Let’s continue building amazing Android apps together!

    **Java is a trademark or registered trademark of Oracle and/or its affiliates.






  • Concurrency & Networking in Android







  • There’s good and bad news about the Z Fold and Flip 7 batteries - Android Authority



    The Samsung Galaxy Z Flip 6 and Z Fold 6 on a table.

    Hadlee Simons / Android Authority

    TL;DR

    • The batteries for the Galaxy Z Fold 7 and Galaxy Z Flip 7 have received UL Demko certification.
    • The Z Fold 7 would have a total battery capacity of 4,272mAh, while the Z Flip 7 gets 4,174mAh.
    • Both devices may have 25W wireless charging speeds, up from the 15W of previous generations.

    As we get closer to summer, Samsung’s next generation of foldables is looming just over the horizon. We’re anticipating Samsung’s next Galaxy Unpacked event in the first half of July, which may be held in New York for the first time in three years. Here, we should see the Galaxy Z Fold 7 and Z Flip 7 devices, and leaks continue to give us a good idea of what to expect.

    What appear to be the batteries for both the Galaxy Z Fold 7 and Z Flip 7 have received UL Demko certification, which follows their earlier BIS certification, according to TheTechOutlook. Because of this, we now have some solid expectations for the capacities of both batteries.

    For the Galaxy Z Fold 7, we’re looking at possible battery model numbers EB-BF966ABE and EB-BF967ABE, with certificate numbers DK-163799-UL and DK-163657-UL. These are Li-ion batteries with capacities of 2,126mAh and 2,146mAh, for a rated total of 4,272mAh. As a comparison, the Galaxy Z Fold 6 packs 2,355mAh and 1,918mAh batteries, bringing its rated total to 4,273mAh. In terms of marketing, since the Z Fold 6 has a typical 4,400mAh capacity, we should expect something similar for the Z Fold 7 as well.

    Regarding the Z Flip 7, the potential batteries have model numbers EB-BF766ABE and EB-BF767ABE, with certification numbers DK-163399-UL and DK-163928-UL. Here, the capacities are 1,189mAh and 2,985mAh, for a total of 4,174mAh. For reference, the Galaxy Z Flip 6’s cells were rated at 2,790mAh and 1,097mAh, a total of 3,887mAh. The typical capacity for the Z Flip 6 is 4,000mAh, so Samsung may position the Z Flip 7 at around 4,300mAh.

    From these new certification listings, those who prefer the larger Galaxy Z Fold series could see a negligible drop in battery capacity, while Z Flip fans are likely due a more substantial increase. Of course, actual battery life depends on what you do with your device all day, so these numbers may or may not have a big impact. We’ll find out when the phones launch and we try them out ourselves.

    But there is some good news for both, thankfully. It appears the next generation of foldables should support 25W wireless charging, according to their listings in China’s 3C certification database, as spotted by TheTechOutlook. However, we also saw that both the Z Fold 7 and Flip 7 might have only 25W wired charging speeds, which isn’t as impressive as some of Samsung’s other flagships, and even mid-range devices, that charge at 45W.

    We also expect the Galaxy Z Fold 7 and Flip 7 to have a Snapdragon 8 Elite SoC and at least 12GB of RAM. With just a couple more months before the release of Samsung’s next-generation foldables, we shouldn’t have much longer to wait, and we’ll likely see plenty more leaks in the coming weeks.

    Got a tip? Talk to us! Email our staff at news@androidauthority.com. You can stay anonymous or get credit for the info, it’s your choice.




  • Android User Interface Development | Kodeco







  • Create a Multiscreen App in Android



    This module covers essential navigation concepts and techniques in Jetpack Compose. You will create a multiscreen movie booking app,
    exploring the Navigation Component, passing arguments between screens, and sharing data with external apps. The course progresses
    to advanced topics like implementing deep links and setting up a bottom navigation bar. Throughout the module, you gain hands-on
    experience with key navigation elements such as the navigation graph, controller, and host, while also learning to handle
    arguments and create a seamless user experience. By the end, you will have a comprehensive understanding of how to implement
    efficient navigation in Jetpack Compose applications.
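    As a flavor of the navigation concepts the module covers, here is a minimal sketch of a Compose navigation graph that passes an argument between screens (MovieListScreen and MovieDetailScreen are hypothetical composables, not course code):

    import androidx.compose.runtime.Composable
    import androidx.navigation.NavType
    import androidx.navigation.compose.NavHost
    import androidx.navigation.compose.composable
    import androidx.navigation.compose.rememberNavController
    import androidx.navigation.navArgument

    @Composable
    fun MovieBookingApp() {
        val navController = rememberNavController()
        NavHost(navController = navController, startDestination = "movies") {
            composable("movies") {
                MovieListScreen(onMovieClick = { movieId ->
                    navController.navigate("details/$movieId") // pass the id in the route
                })
            }
            composable(
                route = "details/{movieId}",
                arguments = listOf(navArgument("movieId") { type = NavType.StringType }),
            ) { backStackEntry ->
                MovieDetailScreen(movieId = backStackEntry.arguments?.getString("movieId"))
            }
        }
    }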




  • Beginning Android & Kotlin | Kodeco



    We understand that circumstances can change, and if you need to withdraw from the bootcamp, your options will vary depending on your billing cycle:

    – If you enrolled with a monthly plan, you can cancel future billing on your membership and you will not be renewed on your next billing date, OR you can pause your membership for up to three months and pick up your studies again at that time.

    – If you enrolled with a one-time payment, you will be eligible for a full refund within the first 14 days of your enrollment into the bootcamp.

    *Please note: if you’ve accessed a significant portion of program materials, this might affect your eligibility for a full refund.

    Please email support@kodeco.com for further assistance on the withdrawal process.


