The Anatomy of an Uno Platform Solution

It’s been almost a month since I left home to attend the Microsoft MVP Summit in Seattle. I’m still on the road, currently in Athens, Greece, with numerous notes for upcoming articles. While traveling makes writing challenging, I want to maintain the order of my Uno Platform series to ensure everything makes sense for readers.

In this article, we’ll dive into the structure of an Uno Platform solution. There’s some “black magic” happening behind the scenes, so understanding how everything works will make development significantly easier.

What is Uno Platform?

Before we dive into the anatomy, let’s briefly explain what Uno Platform is. Uno Platform is an open-source framework that enables developers to build cross-platform applications from a single codebase. Using C# and XAML, you can create applications that run on Windows, iOS, Android, macOS, Linux, and WebAssembly.

Root Solution Structure

An Uno Platform solution follows a specific structure that facilitates cross-platform development. Let’s break down the key components:

Main (and only) Project

The core of an Uno Platform solution is the main shared project (in our example, “UnoAnatomy”). This project contains cross-platform code shared across all target platforms and includes:

  • Assets: Contains shared resources like images and icons used across all platforms. These assets may be adapted for different screen densities and platforms as needed.
  • Serialization: Home of the JsonSerializerContext. Since .NET 6, System.Text.Json can generate serialization metadata at compile time through a JsonSerializerContext, which improves performance and reduces reflection usage, a benefit that matters most in AOT compilation scenarios such as WebAssembly and native apps (see the sketch after this list).
  • Models: Contains business model classes representing core domain entities in your application.
  • Presentation: Holds UI components including pages, controls, and views. This typically includes files like Shell.xaml.cs and MainPage.xaml.cs that implement the application’s UI elements and layout.
  • Platforms:
    • Android: Contains the Android-specific entry point (MainActivity.Android.cs) and any other Android-specific configurations or code.
    • iOS: Contains the iOS-specific entry point (Main.iOS.cs).
    • MacCatalyst: Contains the MacCatalyst-specific entry point (Main.maccatalyst.cs).
    • BrowserWasm: Contains browser (WebAssembly)-specific configuration or code.
    • Desktop: Contains desktop-specific configuration or code.
  • Services: Contains service classes that implement business logic, data access, and other application services.
  • Strings: Stores the localized string resources for the application so it can be translated into multiple languages.
  • Styles: Contains the styles and color configuration for the app.
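
To make the Serialization folder concrete, here is a minimal sketch of a source-generated context. The WeatherForecast model and the AppJsonContext name are placeholders for illustration, not part of the template:

using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical model; substitute one of the classes from your Models folder.
public class WeatherForecast
{
    public DateTime Date { get; set; }
    public int TemperatureC { get; set; }
    public string? Summary { get; set; }
}

// The source generator emits serialization metadata for every type listed
// here, so serialization works without runtime reflection.
[JsonSerializable(typeof(WeatherForecast))]
[JsonSerializable(typeof(List<WeatherForecast>))]
public partial class AppJsonContext : JsonSerializerContext
{
}

// Usage: pass the generated type info instead of relying on reflection.
// var json = JsonSerializer.Serialize(forecast, AppJsonContext.Default.WeatherForecast);
// var back = JsonSerializer.Deserialize(json, AppJsonContext.Default.WeatherForecast);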

Build Configuration Files

Several build configuration files in the root of the solution control the build process:

  • Directory.Build.props: Contains global MSBuild properties applied to all projects.
  • Directory.Build.targets: Contains global MSBuild targets for all projects.
  • Directory.Packages.props: Centralizes package versions for dependency management.
  • global.json: Specifies the Uno.Sdk version and other .NET SDK settings.

The Power of Uno.Sdk

One of the most important aspects of modern Uno Platform development is the Uno.Sdk, which significantly simplifies the development process.

What is Uno.Sdk?

Uno.Sdk is a specialized MSBuild SDK that streamlines Uno Platform development by providing:

  1. A cross-platform development experience that simplifies targeting multiple platforms from a single project
  2. Automatic management of platform-specific dependencies and configurations
  3. A simplified build process that handles the complexity of building for different target platforms
  4. Feature-based configuration that enables adding functionality through the UnoFeatures property

In your project file, you’ll see <Project Sdk="Uno.Sdk"> at the top, indicating that this project uses the Uno SDK rather than the standard .NET SDK.

Key Components of the Project File

TargetFrameworks

<TargetFrameworks>net9.0-android;net9.0-ios;net9.0-maccatalyst;net9.0-windows10.0.26100;net9.0-browserwasm;net9.0-desktop</TargetFrameworks>

This line specifies that your application targets:

  • Android
  • iOS
  • macOS (via Mac Catalyst)
  • Windows (Windows 10/11 with SDK version 10.0.26100)
  • WebAssembly (for browser-based applications)
  • Desktop (for cross-platform desktop applications)

All of these targets use .NET 9 as the base framework.
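
Because all of these targets are compiled from the same project, platform-specific code can be isolated with conditional compilation. The following is a minimal sketch, assuming the standard per-platform compilation symbols the .NET SDK defines for each target framework; the GetPlatformName helper is purely illustrative:

public static class PlatformInfo
{
    // Each target framework defines its own compilation symbol
    // (ANDROID, IOS, MACCATALYST, WINDOWS, ...), so only the matching
    // branch is compiled into each platform build.
    public static string GetPlatformName()
    {
#if ANDROID
        return "Android";
#elif IOS
        return "iOS";
#elif MACCATALYST
        return "Mac Catalyst";
#elif WINDOWS
        return "Windows";
#else
        return "WebAssembly or Desktop";
#endif
    }
}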

Single Project Configuration

<OutputType>Exe</OutputType>
<UnoSingleProject>true</UnoSingleProject>
  • OutputType: Specifies that this project builds an executable application
  • UnoSingleProject: Enables Uno’s single-project approach, allowing you to maintain one codebase for all platforms

Application Metadata

<ApplicationTitle>UnoAnatomy</ApplicationTitle>
<ApplicationId>com.companyname.UnoAnatomy</ApplicationId>
<ApplicationDisplayVersion>1.0</ApplicationDisplayVersion>
<ApplicationVersion>1</ApplicationVersion>
<ApplicationPublisher>joche</ApplicationPublisher>
<Description>UnoAnatomy powered by Uno Platform.</Description>

These properties define your app’s identity and metadata used in app stores and installation packages.

UnoFeatures

The most powerful aspect of Uno.Sdk is the UnoFeatures property:

<UnoFeatures>
  Material;
  Dsp;
  Hosting;
  Toolkit;
  Logging;
  Mvvm;
  Configuration;
  Http;
  Serialization;
  Localization;
  Navigation;
  ThemeService;
</UnoFeatures>

This automatically adds relevant NuGet packages for each listed feature:

  • Material: Material Design UI components
  • Dsp: Design System Package (DSP) support for importing design tokens such as the app’s color palette
  • Hosting: Dependency injection and host builder pattern
  • Toolkit: Uno Toolkit controls and helpers
  • Logging: Logging infrastructure
  • Mvvm: Model-View-ViewModel pattern support via CommunityToolkit.Mvvm
  • Configuration: Application configuration framework
  • Http: HTTP client capabilities
  • Serialization: Data serialization/deserialization
  • Localization: Multi-language support
  • Navigation: Navigation services
  • ThemeService: Dynamic theme support

The UnoFeatures property eliminates the need to manually add numerous NuGet packages and ensures compatibility between components.
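
As an example of what a feature brings in, the Mvvm feature references CommunityToolkit.Mvvm, so view models can use its source generators. A minimal sketch (the CounterViewModel name is illustrative):

using CommunityToolkit.Mvvm.ComponentModel;
using CommunityToolkit.Mvvm.Input;

// ObservableObject supplies INotifyPropertyChanged; the attributes below
// generate a Count property and an IncrementCommand at compile time.
public partial class CounterViewModel : ObservableObject
{
    [ObservableProperty]
    private int count;

    [RelayCommand]
    private void Increment() => Count++;
}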

Benefits of the Uno Platform Structure

This structured approach to cross-platform development offers several advantages:

  1. Code Sharing: Most code is shared across platforms, reducing duplication and maintenance overhead.
  2. Platform-Specific Adaptation: When needed, the structure allows for platform-specific implementations.
  3. Simplified Dependencies: The Uno.Sdk handles complex dependency management behind the scenes.
  4. Consistent Experience: Ensures a consistent development experience across all target platforms.
  5. Future-Proofing: The architecture makes it easier to add support for new platforms in the future.

Conclusion

Understanding the anatomy of an Uno Platform solution is crucial for effective cross-platform development. The combination of shared code, platform-specific entry points, and the powerful Uno.Sdk creates a development experience that makes it much easier to build and maintain applications across multiple platforms from a single codebase.

By leveraging this structure and the features provided by the Uno Platform, you can focus on building your application’s functionality rather than dealing with the complexities of cross-platform development.

In my next article in this series, we’ll dive deeper into the practical aspects of developing with Uno Platform, exploring how to leverage these structural components to build robust cross-platform applications.

 

Understanding CPU Translation Layers: ARM to x86/x64 for Windows, macOS, and Linux

As technology continues to evolve, the need for seamless interoperability between different hardware architectures becomes increasingly crucial. One significant aspect of this interoperability is the ability to run software compiled for one CPU architecture on another. This blog post explores how CPU translation layers enable the execution of ARM-compiled applications on x86/x64 platforms across Windows, macOS, and Linux.

Windows OS: Bridging ARM and x86/x64

Microsoft’s approach to bridging ARM and x86/x64 is embodied in Windows on ARM. This system allows ARM-based devices to run applications compiled for x86/x64 efficiently, incorporating several key technologies:

  • WOW64 (Windows on Windows): This subsystem provides compatibility for 32-bit x86 applications on ARM64 devices through a mix of emulation and natively compiled system libraries.
  • x86/x64 Emulation: Windows 10 on ARM can emulate x86 applications, and Windows 11 adds x64 emulation. The emulation layer dynamically translates x86/x64 instructions to ARM instructions at runtime, using Just-In-Time (JIT) compilation techniques to convert code as it is needed.
  • Native ARM64 Support: To avoid the performance overhead associated with emulation, Microsoft encourages developers to compile their applications directly for ARM64.

macOS: The Power of Rosetta 2

Apple’s transition from Intel (x86/x64) to Apple Silicon (ARM) has been facilitated by Rosetta 2, a sophisticated translation layer designed to make this process as smooth as possible:

  • Dynamic Binary Translation: Rosetta 2 converts x86_64 instructions to ARM instructions on-the-fly, enabling users to run x86_64 applications transparently on ARM-based Macs.
  • Ahead-of-Time (AOT) Compilation: For some applications, Rosetta 2 can pre-translate x86_64 binaries to ARM before execution, boosting performance.
  • Universal Binaries: Apple encourages developers to use Universal Binaries, which include both x86_64 and ARM64 executables, allowing the operating system to select the appropriate version based on the hardware.

Linux: Flexibility with QEMU

Linux’s open-source nature provides a versatile approach to CPU translation through QEMU, a widely-used emulator that supports various architectures, including ARM to x86/x64:

  • User-mode Emulation: QEMU can run individual Linux executables compiled for ARM on an x86/x64 host by translating system calls and CPU instructions.
  • Full-system Emulation: It can also emulate a complete ARM system, enabling an x86/x64 machine to run an ARM operating system and its applications.
  • Performance Enhancements: When the guest and host share the same architecture, QEMU can hand execution to KVM (Kernel-based Virtual Machine) for near-native speed; cross-architecture emulation instead relies on QEMU’s dynamic binary translation.

How Translation Layers Work

The translation process involves several steps to ensure smooth execution of applications across different architectures; a simplified sketch follows the list:

  1. Instruction Fetch: The emulator fetches instructions from the source (ARM) binary.
  2. Instruction Decode: The fetched instructions are decoded into a format understandable by the translation layer.
  3. Instruction Translation:
    • JIT Compilation: Converts source instructions into target (x86/x64) instructions in real-time.
    • Caching: Frequently used translations are cached to avoid repeated translation.
  4. Execution: The translated instructions are executed on the target CPU.
  5. System Calls and Libraries:
    • System Call Translation: System calls from the source architecture are translated to their equivalents on the host architecture.
    • Library Mapping: Shared libraries from the source architecture are mapped to their counterparts on the host system.
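
The sketch below is a deliberately simplified illustration of steps 1–4 in C#, treating guest instructions as strings and host code as delegates; a real translation layer works on machine code and emits executable native instructions:

using System;
using System.Collections.Generic;

// Toy translation layer: fetch a guest instruction, translate it once,
// cache the result, and run the cached host version on later visits.
public class TinyTranslator
{
    private readonly Dictionary<string, Action> _cache = new();

    public void Run(IEnumerable<string> guestInstructions)
    {
        foreach (var instruction in guestInstructions)        // 1. fetch
        {
            if (!_cache.TryGetValue(instruction, out var translated))
            {
                translated = Translate(instruction);          // 2-3. decode and translate (JIT)
                _cache[instruction] = translated;             // cache for reuse
            }
            translated();                                     // 4. execute on the host
        }
    }

    private static Action Translate(string instruction) =>
        instruction switch
        {
            "NOP" => () => { },
            _ => () => Console.WriteLine($"emulating {instruction}"),
        };
}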

Performance Considerations

  • Overhead: Emulation introduces overhead, which can impact performance, particularly for compute-intensive applications.
  • Optimization Strategies: Techniques like ahead-of-time compilation, caching, and promoting native support help mitigate performance penalties.
  • Hardware Support: Some ARM processors include hardware extensions to accelerate binary translation.

Developer Considerations

For developers, ensuring compatibility and performance across different architectures involves several best practices:

  • Cross-Compilation: Developers should compile their applications for multiple architectures to provide native performance on each platform.
  • Extensive Testing: Applications must be tested thoroughly in both native and emulated environments to ensure compatibility and performance.

Conclusion

CPU translation layers are pivotal for maintaining software compatibility across different hardware architectures. By leveraging sophisticated techniques such as dynamic binary translation, JIT compilation, and system call translation, these layers bridge the gap between ARM and x86/x64 architectures on Windows, macOS, and Linux. As technology continues to advance, these translation layers will play an increasingly important role in enabling seamless interoperability across diverse computing environments.