Understanding the Document Module: Day 2 – The Foundation of a Financial Accounting System

Introduction

In financial accounting systems, the document module serves as the cornerstone upon which all other functionality is built. Just as physical documents form the basis of traditional accounting practices, the digital document module provides the foundation for recording, processing, and analyzing financial transactions. In this article, we’ll explore the structure and importance of the document module in a modern financial accounting system.

The Core Components

The document module consists of three essential components:

1. Documents

Documents represent the source records of financial events. These might include invoices, receipts, bank statements, journal entries, and various specialized financial documents like balance transfer statements and closing entries. Each document contains metadata such as:

  • Date of the document
  • Document number/reference
  • Description and comments
  • Document type classification
  • Audit information (who created/modified it and when)

Documents serve as the legal proof of financial activities and provide an audit trail that can be followed to verify the accuracy and validity of financial records.
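
To make this concrete, here is a minimal sketch of such a contract in C#. The shape below is an assumption for illustration, not the repository's exact definition:

using System;

// Hypothetical sketch; names and members are assumptions.
public enum DocumentType { Invoice, Receipt, BankStatement, JournalEntry, BalanceTransfer, ClosingEntry }

public interface IDocument
{
    Guid Id { get; }
    DateOnly DocumentDate { get; set; }      // date of the document
    string DocumentNumber { get; set; }      // number/reference
    string Description { get; set; }
    string Comments { get; set; }
    DocumentType Type { get; set; }          // type classification
    string CreatedBy { get; set; }           // audit information
    DateTime CreatedAtUtc { get; set; }
    string? ModifiedBy { get; set; }
    DateTime? ModifiedAtUtc { get; set; }
}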

2. Transactions

Transactions represent the financial impact of documents in the general ledger. While a document captures the business event (e.g., an invoice), the transaction represents how that event affects the company’s financial position. A single document may generate one or more transactions depending on its complexity.

Each transaction is linked to its parent document and contains:

  • Transaction date (which may differ from the document date)
  • Description
  • Reference to the parent document

Transactions bridge the gap between source documents and ledger entries, maintaining the relationship between business events and their financial representations.
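
A matching sketch of the transaction contract, again with assumed names:

using System;

// Hypothetical sketch; a transaction always points back to its document.
public interface ITransaction
{
    Guid Id { get; }
    Guid DocumentId { get; }                 // reference to the parent document
    DateOnly TransactionDate { get; set; }   // may differ from the document date
    string Description { get; set; }
}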

3. Ledger Entries

Ledger entries are the individual debit and credit entries that make up a transaction. They represent the actual changes to account balances in the general ledger. Each ledger entry contains:

  • Reference to the parent transaction
  • Account identifier
  • Entry type (debit or credit)
  • Amount
  • Optional references to persons and cost centers for analytical purposes

Ledger entries implement the double-entry accounting principle, ensuring that for every transaction, debits equal credits.
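
And a sketch of the ledger entry contract, with a small helper that expresses the double-entry rule (names and shapes assumed):

using System;
using System.Collections.Generic;
using System.Linq;

public enum EntryType { Debit, Credit }

// Hypothetical sketch of a single debit or credit line.
public interface ILedgerEntry
{
    Guid Id { get; }
    Guid TransactionId { get; }              // reference to the parent transaction
    string AccountId { get; set; }
    EntryType Type { get; set; }
    decimal Amount { get; set; }
    Guid? PersonId { get; set; }             // optional analytical reference
    Guid? CostCenterId { get; set; }         // optional analytical reference
}

public static class LedgerRules
{
    // Double-entry rule: within a transaction, debits must equal credits.
    public static bool IsBalanced(IEnumerable<ILedgerEntry> entries) =>
        entries.Sum(e => e.Type == EntryType.Debit ? e.Amount : -e.Amount) == 0m;
}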

Why This Modular Approach Matters

The document module’s structure offers several significant advantages:

1. Separation of Concerns

By separating documents, transactions, and ledger entries, the system maintains clear boundaries between:

  • Business events (documents)
  • Financial impacts (transactions)
  • Specific account changes (ledger entries)

This separation allows each layer to focus on its specific responsibilities without being overly coupled to other components.

2. Flexibility and Extensibility

The modular design allows for adding new document types without changing the core accounting logic. Whether handling standard invoices or specialized financial instruments, the same underlying structure applies, making the system highly extensible.

3. Robust Audit Trail

With documents serving as the origin of all financial records, the system maintains a complete audit trail. Every ledger entry can be traced back to its transaction and originating document, providing accountability and transparency.

4. Compliance and Reporting

The document-centric approach aligns with legal and regulatory requirements that mandate keeping original document records. This structure facilitates regulatory compliance and simplifies financial reporting.

Implementation Considerations

When implementing a document module, several design principles should be considered:

Interface-Based Design

Using interfaces like IDocument, ITransaction, and ILedgerEntry promotes flexibility and testability. Services operate against these interfaces rather than concrete implementations, following the Dependency Inversion Principle.
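
Continuing the earlier sketches, a posting service might depend only on these abstractions. The ITransactionFactory interface below is an assumption introduced for illustration:

// Hypothetical sketch: the service sees only interfaces, so any concrete
// document or transaction implementation can be supplied (or mocked in tests).
public interface ITransactionFactory
{
    ITransaction CreateFor(IDocument document);
}

public sealed class DocumentPostingService
{
    private readonly ITransactionFactory _factory;

    public DocumentPostingService(ITransactionFactory factory) => _factory = factory;

    public ITransaction Post(IDocument document)
    {
        // Validation and balancing checks would run here before posting.
        return _factory.CreateFor(document);
    }
}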

Immutability of Processed Documents

Once a document has been processed and its transactions recorded, changes should be restricted to prevent inconsistencies. Any modifications should follow proper accounting procedures, such as creating correction entries.
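
One simple way to enforce this, sketched under the assumption of an IsPosted flag (the actual repository may model it differently):

using System;

public sealed class Document
{
    public bool IsPosted { get; private set; }
    private string _description = string.Empty;

    public string Description
    {
        get => _description;
        set
        {
            // Assumed guard: once posted, the document rejects edits;
            // corrections go through a new correction document instead.
            if (IsPosted)
                throw new InvalidOperationException(
                    "Posted documents are immutable; create a correction entry instead.");
            _description = value;
        }
    }

    public void MarkPosted() => IsPosted = true;
}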

Versioning and Historical Records

The system should maintain historical versions of documents, especially when they’re modified, to preserve the accurate history of financial events.
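
A minimal sketch of one way to do this, assuming a simple snapshot approach; the DocumentVersion type is hypothetical:

using System;
using System.Collections.Generic;

// Hypothetical snapshot type for illustration only.
public sealed record DocumentVersion(
    Guid DocumentId,
    int VersionNumber,
    string Description,
    DateTime CapturedAtUtc);

public sealed class DocumentHistory
{
    private readonly List<DocumentVersion> _versions = new();

    // Capture a snapshot before each modification so the full
    // history of the document remains queryable.
    public void Capture(IDocument document) =>
        _versions.Add(new DocumentVersion(
            document.Id, _versions.Count + 1, document.Description, DateTime.UtcNow));

    public IReadOnlyList<DocumentVersion> Versions => _versions;
}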

Conclusion

The document module serves as the backbone of a financial accounting system, providing the structure and organization needed to maintain accurate financial records. By properly implementing this foundation, accounting systems can ensure data integrity, regulatory compliance, and flexible business operations.

Understanding the document module’s architecture helps developers and accountants alike appreciate the careful design considerations that go into building robust financial systems capable of handling the complexities of modern business operations.

Repo

egarim/SivarErp: Open Source ERP

Building an Agnostic ERP System: Day 1 – Core Architecture

After returning home from an extended journey through the United States, Greece, and Turkey, I found myself contemplating a common challenge over my morning coffee: developers face the same recurring problems in system design and ORM (Object-Relational Mapping) implementation, project after project.

To address these challenges, I’ve decided to tackle a system that most professionals are familiar with—an ERP (Enterprise Resource Planning) system—and develop a design that achieves three critical goals:

  1. Performance Speed: The system must be fast and responsive
  2. Technology Agnosticism: The architecture should be platform-independent
  3. Consistent Performance: The system should maintain its performance over time

Design Decisions

To achieve these goals, I’m implementing the following key design decisions:

  • Utilizing the SOLID design principles to ensure maintainability and extensibility
  • Building with C# and .NET 9 to leverage their modern language features
  • Creating an agnostic architecture that can be reimplemented in various technologies like DevExpress XAF or Entity Framework

Day 1: Foundational Structure

In this first article, I’ll propose an initial folder structure that may evolve as the system develops. I’ll also describe a set of base classes and interfaces that will form the foundation of our system.

You can find all the source code for this solution in the designated repository.

The Core Layer

Today we’re starting with the core layer—a set of interfaces that most entities will implement. The system design follows SOLID principles to ensure it can be easily reimplemented using different technologies.

Base Interfaces

Here’s the foundation of our interface hierarchy; a sketch of these contracts follows the list:

  • IEntity: Core entity interface defining the Id property
  • IAuditable: Interface for entities with audit information
  • IArchivable: Interface for entities supporting soft delete
  • IVersionable: Interface for entities with effective dating
  • ITimeTrackable: Interface for entities requiring time tracking
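
Here is a minimal C# sketch of these contracts. The members are assumptions inferred from the descriptions above, not the repository's exact definitions:

using System;

public interface IEntity
{
    Guid Id { get; }
}

public interface IAuditable
{
    string CreatedBy { get; set; }
    DateTime CreatedAtUtc { get; set; }
    string? ModifiedBy { get; set; }
    DateTime? ModifiedAtUtc { get; set; }
}

public interface IArchivable
{
    bool IsArchived { get; set; }            // soft-delete flag
    DateTime? ArchivedAtUtc { get; set; }
}

public interface IVersionable
{
    DateOnly EffectiveFrom { get; set; }     // effective dating
    DateOnly? EffectiveTo { get; set; }
}

public interface ITimeTrackable
{
    // Interpretation assumed: creation/update timestamps.
    DateTime CreatedUtc { get; set; }
    DateTime UpdatedUtc { get; set; }
}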

Service Interfaces

To complement our entity interfaces, we’re also defining service interfaces (sketched after the list):

  • IAuditService: Interface for audit-related operations
  • IArchiveService: Interface for archiving operations
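
A hedged sketch of what these service contracts might look like; the method names are assumptions:

public interface IAuditService
{
    // Stamp audit fields when an entity is created or modified.
    void OnCreated(IAuditable entity, string userName);
    void OnModified(IAuditable entity, string userName);
}

public interface IArchiveService
{
    // Soft delete and restore, rather than physical deletion.
    void Archive(IArchivable entity);
    void Restore(IArchivable entity);
}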

 

Repo

egarim/SivarErp: Open Source ERP

Next Steps

In upcoming articles, I’ll expand on this foundation by implementing concrete classes, developing the domain layer, and demonstrating how this architecture can be applied to specific ERP modules.

The goal is to create a reference architecture that addresses the recurring challenges in system design while remaining adaptable to different technological implementations.

Stay tuned for the next installment where we’ll dive deeper into the implementation details of our core interfaces.

The Anatomy of an Uno Platform Solution

It’s been almost a month since I left home to attend the Microsoft MVP Summit in Seattle. I’m still on the road, currently in Athens, Greece, with numerous notes for upcoming articles. While traveling makes writing challenging, I want to maintain the order of my Uno Platform series to ensure everything makes sense for readers.

In this article, we’ll dive into the structure of an Uno Platform solution. There’s some “black magic” happening behind the scenes, so understanding how everything works will make development significantly easier.

What is Uno Platform?

Before we dive into the anatomy, let’s briefly explain what Uno Platform is. Uno Platform is an open-source framework that enables developers to build cross-platform applications from a single codebase. Using C# and XAML, you can create applications that run on Windows, iOS, Android, macOS, Linux, and WebAssembly.

Root Solution Structure

An Uno Platform solution follows a specific structure that facilitates cross-platform development. Let’s break down the key components:

Main (and only) Project

The core of an Uno Platform solution is the main shared project (in our example, “UnoAnatomy”). This project contains cross-platform code shared across all target platforms and includes:

  • Assets: Contains shared resources like images and icons used across all platforms. These assets may be adapted for different screen densities and platforms as needed.
  • Serialization: This is where the JsonSerializerContext lives. Since .NET 6, the serialization source generator lets you control how objects are serialized through the JsonSerializerContext class. It provides ahead-of-time metadata generation for better performance and reduced reflection usage, which is particularly beneficial for AOT compilation scenarios like Blazor WebAssembly and native apps. (A short example follows this list.)
  • Models: Contains business model classes representing core domain entities in your application.
  • Presentation: Holds UI components including pages, controls, and views. This typically includes files like Shell.xaml.cs and MainPage.xaml.cs that implement the application’s UI elements and layout.
  • Platforms:
    • Android: Contains the Android-specific entry point (MainActivity.Android.cs) and any other Android-specific configurations or code.
    • iOS: Contains the iOS-specific entry point (Main.iOS.cs).
    • MacCatalyst: Contains the MacCatalyst-specific entry point (Main.maccatalyst.cs).
    • BrowserWasm: Contains browser WebAssembly-specific configurations or code.
    • Desktop: Contains desktop-specific configurations or code.
  • Services: Contains service classes implementing business logic, data access, and related concerns.
  • Strings: Stores the localized string resources for the application so it can be translated into multiple languages.
  • Styles: Contains the styles and color configuration for the app.
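
To illustrate what lives in the Serialization folder, here is a minimal JsonSerializerContext example. The Person model and AppJsonContext names are assumptions, not types from the Uno template:

using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical model and context for illustration.
public sealed record Person(string Name, int Age);

[JsonSerializable(typeof(Person))]
public partial class AppJsonContext : JsonSerializerContext
{
}

// Usage: the generator emits metadata at compile time, so serialization
// needs no runtime reflection (AOT/WebAssembly friendly):
// string json = JsonSerializer.Serialize(new Person("Ada", 36), AppJsonContext.Default.Person);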

Build Configuration Files

Several build configuration files in the root of the solution control the build process:

  • Directory.Build.props: Contains global MSBuild properties applied to all projects.
  • Directory.Build.targets: Contains global MSBuild targets for all projects.
  • Directory.Packages.props: Centralizes package versions for dependency management.
  • global.json: Specifies the Uno.Sdk version and other .NET SDK configurations.

The Power of Uno.Sdk

One of the most important aspects of modern Uno Platform development is the Uno.Sdk, which significantly simplifies the development process.

What is Uno.Sdk?

Uno.Sdk is a specialized MSBuild SDK that streamlines Uno Platform development by providing:

  1. A cross-platform development experience that simplifies targeting multiple platforms from a single project
  2. Automatic management of platform-specific dependencies and configurations
  3. A simplified build process that handles the complexity of building for different target platforms
  4. Feature-based configuration that enables adding functionality through the UnoFeatures property

In your project file, you’ll see <Project Sdk="Uno.Sdk"> at the top, indicating that this project uses the Uno SDK rather than the standard .NET SDK.

Key Components of the Project File

TargetFrameworks

<TargetFrameworks>net9.0-android;net9.0-ios;net9.0-maccatalyst;net9.0-windows10.0.26100;net9.0-browserwasm;net9.0-desktop</TargetFrameworks>

This line specifies that your application targets:

  • Android
  • iOS
  • macOS (via Mac Catalyst)
  • Windows (Windows 10/11 with SDK version 10.0.26100)
  • WebAssembly (for browser-based applications)
  • Desktop (for cross-platform desktop applications)

All of these targets use .NET 9 as the base framework.

Single Project Configuration

<OutputType>Exe</OutputType>
<UnoSingleProject>true</UnoSingleProject>
  • OutputType: Specifies this project builds an executable application
  • UnoSingleProject: Enables Uno’s single-project approach, allowing you to maintain one codebase for all platforms

Application Metadata

<ApplicationTitle>UnoAnatomy</ApplicationTitle>
<ApplicationId>com.companyname.UnoAnatomy</ApplicationId>
<ApplicationDisplayVersion>1.0</ApplicationDisplayVersion>
<ApplicationVersion>1</ApplicationVersion>
<ApplicationPublisher>joche</ApplicationPublisher>
<Description>UnoAnatomy powered by Uno Platform.</Description>

These properties define your app’s identity and metadata used in app stores and installation packages.

UnoFeatures

The most powerful aspect of Uno.Sdk is the UnoFeatures property:

<UnoFeatures>
  Material;
  Dsp;
  Hosting;
  Toolkit;
  Logging;
  Mvvm;
  Configuration;
  Http;
  Serialization;
  Localization;
  Navigation;
  ThemeService;
</UnoFeatures>

This automatically adds relevant NuGet packages for each listed feature:

  • Material: Material Design UI components
  • Dsp: Design System Package (DSP) support for theming
  • Hosting: Dependency injection and host builder pattern
  • Toolkit: Community Toolkit components
  • Logging: Logging infrastructure
  • Mvvm: Model-View-ViewModel pattern implementation
  • Configuration: Application configuration framework
  • Http: HTTP client capabilities
  • Serialization: Data serialization/deserialization
  • Localization: Multi-language support
  • Navigation: Navigation services
  • ThemeService: Dynamic theme support

The UnoFeatures property eliminates the need to manually add numerous NuGet packages and ensures compatibility between components.

Benefits of the Uno Platform Structure

This structured approach to cross-platform development offers several advantages:

  1. Code Sharing: Most code is shared across platforms, reducing duplication and maintenance overhead.
  2. Platform-Specific Adaptation: When needed, the structure allows for platform-specific implementations.
  3. Simplified Dependencies: The Uno.Sdk handles complex dependency management behind the scenes.
  4. Consistent Experience: Ensures a consistent development experience across all target platforms.
  5. Future-Proofing: The architecture makes it easier to add support for new platforms in the future.

Conclusion

Understanding the anatomy of an Uno Platform solution is crucial for effective cross-platform development. The combination of shared code, platform-specific heads, and the powerful Uno.Sdk creates a development experience that makes it much easier to build and maintain applications across multiple platforms from a single codebase.

By leveraging this structure and the features provided by the Uno Platform, you can focus on building your application’s functionality rather than dealing with the complexities of cross-platform development.

In my next article in this series, we’ll dive deeper into the practical aspects of developing with Uno Platform, exploring how to leverage these structural components to build robust cross-platform applications.

 

Related articles

Getting Started with Uno Platform: First Steps and Configuration Choices | Joche Ojeda

My Adventures Picking a UI Framework: Why I Chose Uno Platform | Joche Ojeda

Exploring the Uno Platform: Handling Unsafe Code in Multi-Target Applications | Joche Ojeda

Running Docker on a Windows Surface ARM64 Using WSL2

My Docker Adventure in Athens

Hello fellow tech enthusiasts!

I’m currently in Athens, Greece, enjoying a lovely Easter Sunday, when I decided to tackle a little tech project – getting Docker running on my Microsoft Surface with an ARM64 CPU. If you’ve ever tried to do this, you might know it’s not as straightforward as it sounds!

After some research, I discovered something important: there’s a difference between Docker Enterprise and Docker Community Edition (CE). While the enterprise version doesn’t support ARM64 yet, Docker CE does have versions for both ARM64 and x64 architectures. Perfect!

The WSL2 Solution

I initially tried to install Docker directly on Windows, but quickly ran into roadblocks. That’s when I decided to try the Windows Subsystem for Linux (WSL2) route instead. Spoiler alert: it worked like a charm!

While you won’t get the nice Docker Desktop UI that Windows users might be accustomed to, the command line interface through WSL2 works perfectly fine. After all, Docker was born on Linux, so running it in a Linux environment makes sense!

Step-by-Step Guide to Installing Docker CE on WSL2

Here’s how I got Docker CE up and running on my Surface using WSL2:

Step 1: Update Your Packages

First, make sure your WSL2 system is up to date:

sudo apt update && sudo apt upgrade -y

Step 2: Install Required Packages

Install the necessary packages to use HTTPS repositories:

sudo apt install -y apt-transport-https ca-certificates curl software-properties-common gnupg lsb-release

Step 3: Add Docker’s Official GPG Key

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg

Step 4: Set Up the Stable Docker Repository

echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

Step 5: Update APT with the New Repository

sudo apt update

Step 6: Install Docker CE

sudo apt install -y docker-ce docker-ce-cli containerd.io

Step 7: Start the Docker Service

sudo service docker start

Step 8: Add Your User to the Docker Group

This allows you to run Docker without sudo:

sudo usermod -aG docker $USER

Step 9: Apply the Group Changes

Either log out and back in, or run:

newgrp docker

Step 10: Verify Your Installation

docker --version
docker run hello-world

Pro Tip!

If you want Docker to start automatically when you launch WSL2, add the service start command to your .bashrc or .zshrc file:

echo "sudo service docker start" >> ~/.bashrc

Final Thoughts

What started as a potentially frustrating experience turned into a surprisingly smooth process. WSL2 continues to impress me with how well it bridges the Windows and Linux worlds. If you have a Surface or any other ARM64-based Windows device and need to run Docker, I highly recommend the WSL2 approach.

Have you tried running Docker on an ARM device? What was your experience like? Let me know in the comments below!

Happy containerizing! 🐳