by Joche Ojeda | Jan 20, 2025 | ADO, ADO.NET, Database, dotnet
When I first encountered the challenge of migrating hundreds of Visual Basic 6 reports to .NET, I never imagined it would lead me down a path of discovering specialized data analytics tools. Today, I want to share my experience with ADOMD.NET and how it could have transformed our reporting challenges, even though we couldn’t implement it due to our database constraints.
The Challenge: The Sales Gap Report
The story begins with a seemingly simple report called “Sales Gap.” Its purpose was critical: identify periods when regular customers stopped purchasing specific items. For instance, if a customer typically bought 10 units monthly from January to May, then suddenly stopped in June and July, sales representatives needed to understand why.
This report required complex queries across multiple transactional tables:
- Invoicing
- Sales
- Returns
- Debits
- Credits
Initially, the report took about a minute to run. As our data grew, so did the execution time—eventually reaching an unbearable 15 minutes. We were stuck with a requirement to use real-time transactional data, making traditional optimization techniques like data warehousing off-limits.
Enter ADOMD.NET: A Specialized Solution
ADOMD.NET (ActiveX Data Objects Multidimensional .NET) emerged as a potential solution. Here’s why it caught my attention:
Key Features:
1. Multidimensional Analysis
Unlike traditional SQL queries, ADOMD.NET uses MDX (Multidimensional Expressions), specifically designed for analytical queries. Here’s a basic example:
string mdxQuery = @"
SELECT
{[Measures].[Sales Amount]} ON COLUMNS,
{[Date].[Calendar Year].MEMBERS} ON ROWS
FROM [Sales Cube]
WHERE [Product].[Category].[Electronics]";
2. Performance Optimization
ADOMD.NET is built for analytical workloads, offering better performance for complex calculations and aggregations. It achieves this through:
- Specialized data structures for multidimensional analysis
- Efficient handling of hierarchical data
- Built-in support for complex calculations
3. Advanced Analytics Capabilities
The library supports sophisticated analysis patterns, such as year-over-year growth calculations:
string mdxQuery = @"
WITH MEMBER [Measures].[GrowthVsPreviousYear] AS
([Measures].[Sales Amount] -
([Measures].[Sales Amount], [Date].[Calendar Year].PREVMEMBER)
)/([Measures].[Sales Amount], [Date].[Calendar Year].PREVMEMBER)
SELECT
{[Measures].[Sales Amount], [Measures].[GrowthVsPreviousYear]}
ON COLUMNS...";
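For context, this is roughly how an MDX query like the ones above is executed from C# with the ADOMD.NET client. This is a minimal sketch: the connection string, cube name, and member names are placeholders rather than anything from our actual environment.
using System;
using Microsoft.AnalysisServices.AdomdClient;

// Hypothetical connection string pointing at an SSAS instance
const string connectionString = "Data Source=localhost;Catalog=SalesAnalysis";
using (var connection = new AdomdConnection(connectionString))
{
    connection.Open();
    // The same kind of analytical query shown earlier
    var command = new AdomdCommand(@"
        SELECT
            {[Measures].[Sales Amount]} ON COLUMNS,
            {[Date].[Calendar Year].MEMBERS} ON ROWS
        FROM [Sales Cube]", connection);
    // ExecuteCellSet returns the multidimensional result as a CellSet
    CellSet cellSet = command.ExecuteCellSet();
    // Walk the ROWS axis and print one sales figure per calendar year
    var rows = cellSet.Axes[1].Set.Tuples;
    for (int r = 0; r < rows.Count; r++)
    {
        string year = rows[r].Members[0].Caption;
        string amount = cellSet.Cells[0, r].FormattedValue;
        Console.WriteLine($"{year}: {amount}");
    }
}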
Lessons Learned
While we couldn’t implement ADOMD.NET due to our use of Pervasive Database instead of SQL Server, the investigation taught me valuable lessons about report optimization:
- The importance of choosing the right tools for analytical workloads
- The limitations of running complex analytics on transactional databases
- The value of specialized query languages for different types of data analysis
Modern Applications
Today, ADOMD.NET continues to be relevant for organizations using:
- SQL Server Analysis Services (SSAS)
- Azure Analysis Services
- Power BI Premium datasets
If I were facing the same challenge today with SQL Server, ADOMD.NET would be my go-to solution for:
- Complex sales analysis
- Customer behavior tracking
- Performance-intensive analytical reports
Conclusion
While our specific situation with Pervasive Database prevented us from using ADOMD.NET, it remains a powerful tool for organizations using Microsoft’s analytics stack. The experience taught me that sometimes the solution isn’t about optimizing existing queries, but about choosing the right specialized tools for analytical workloads.
Remember: Just because you can run analytics on your transactional database doesn’t mean you should. Tools like ADOMD.NET exist for a reason, and understanding when to use them can save countless hours of optimization work and provide better results for your users.
by Joche Ojeda | Jan 15, 2025 | C#, dotnet, Emit, MetaProgramming, Reflection
Every programmer encounters that one technology that draws them into the darker arts of software development. For some, it’s metaprogramming; for others, it’s assembly hacking. For me, it was the mysterious world of runtime code generation through Emit in the early 2000s, during my adventures with XPO and the enigmatic Sage Accpac ERP.
The Quest Begins: A Tale of Documentation and Dark Arts
Back in the early 2000s, when the first version of XPO was released, I found myself working alongside my cousin Carlitos in our startup. Fresh from his stint as an ERP consultant in the United States, Carlitos brought with him deep knowledge of Sage Accpac, setting us on a path to provide integration services for this complex system.
Our daily bread and butter was custom reports – starting with Crystal Reports before graduating to DevExpress’s XtraReports and XtraPivotGrid. But we faced an interesting challenge: Accpac’s database was intentionally designed to resist reverse engineering, with flat tables devoid of constraints or relationships. All we had was their HTML documentation, a labyrinth of interconnected pages holding the secrets of their entity relationships.
Genesis: When Documentation Meets Dark Magic
This challenge birthed Project Genesis, my ambitious attempt to create an XPO class generator that could parse Accpac’s documentation. The first hurdle was parsing HTML – a quest that led me to CodePlex (yes, I’m dating myself here) and the discovery of HTMLAgilityPack, a remarkable tool that still serves developers today.
But the real dark magic emerged when I faced the challenge of generating classes dynamically. Buried in our library’s .NET books, I discovered the arcane art of Emit – a powerful technique for runtime assembly and class generation that would forever change my perspective on what’s possible in .NET.
Diving into the Abyss: Understanding Emit
At its core, Emit is like having a magical forge where you can craft code at runtime. Imagine being able to write code that writes more code – not just as text to be compiled later, but as actual, executable IL instructions that the CLR can run immediately.
AssemblyName assemblyName = new AssemblyName("DynamicAssembly");
AssemblyBuilder assemblyBuilder = AssemblyBuilder.DefineDynamicAssembly(
assemblyName,
AssemblyBuilderAccess.Run
);
This seemingly simple code opens a portal to one of .NET’s most powerful capabilities: dynamic assembly generation. It’s the beginning of a spell that allows you to craft types and methods from pure thought (and some carefully crafted IL instructions).
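To give a feel for where this leads, here is a small, self-contained sketch (not the original Genesis code) that forges a type with a single method at runtime and then invokes it:
using System;
using System.Reflection;
using System.Reflection.Emit;

// Define a dynamic, in-memory assembly and module
var assemblyName = new AssemblyName("DynamicAssembly");
AssemblyBuilder assemblyBuilder = AssemblyBuilder.DefineDynamicAssembly(
    assemblyName, AssemblyBuilderAccess.Run);
ModuleBuilder moduleBuilder = assemblyBuilder.DefineDynamicModule("MainModule");
// Craft a public type with one method: string Greet()
TypeBuilder typeBuilder = moduleBuilder.DefineType("Greeter", TypeAttributes.Public);
MethodBuilder methodBuilder = typeBuilder.DefineMethod(
    "Greet", MethodAttributes.Public, typeof(string), Type.EmptyTypes);
// Emit the IL: push a string onto the stack and return it
ILGenerator il = methodBuilder.GetILGenerator();
il.Emit(OpCodes.Ldstr, "Hello from emitted IL!");
il.Emit(OpCodes.Ret);
// Bake the type, instantiate it, and call the method via reflection
Type greeterType = typeBuilder.CreateType();
object greeter = Activator.CreateInstance(greeterType);
string greeting = (string)greeterType.GetMethod("Greet").Invoke(greeter, null);
Console.WriteLine(greeting); // Hello from emitted IL!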
The Power and the Peril
Like all dark magic, Emit comes with its own dangers and responsibilities. When you’re generating IL directly, you’re dancing with the very fabric of .NET execution. One wrong move – one misplaced instruction – and your carefully crafted spell can backfire spectacularly.
The first rule of Emit Club is: don’t use Emit unless you absolutely have to. The second rule is: if you do use it, document everything meticulously. Your future self (and your team) will thank you.
Modern Alternatives and Evolution
Today, the .NET ecosystem offers alternatives like Source Generators that provide similar power with less risk. But understanding Emit remains valuable – it’s like knowing the fundamental laws of magic while using higher-level spells for daily work.
In my case, Project Genesis evolved beyond its original scope, teaching me crucial lessons about runtime code generation, performance optimization, and the delicate balance between power and maintainability.
Conclusion: The Magic Lives On
Twenty years later, Emit remains one of .NET’s most powerful and mysterious features. While modern development practices might steer us toward safer alternatives, understanding these fundamental building blocks of runtime code generation gives us deeper insight into the framework’s capabilities.
For those brave enough to venture into this realm, remember: with great power comes great responsibility – and the need for comprehensive unit tests. The dark magic of Emit might be seductive, but like all powerful tools, it demands respect and careful handling.
by Joche Ojeda | Jan 14, 2025 | C#, dotnet, MetaProgramming, Reflection
The Beginning of a Digital Sorcerer
Every master of the dark arts has an origin story, and mine begins in the ancient realm of MS-DOS 6.1. What started as simple experimentation with BAT files would eventually lead me down a path to discovering one of programming’s most powerful arts: metaprogramming.
I still remember the day my older brother Oscar introduced me to the mystical DIR command. He was three years ahead of me in school, already initiated into the computer classes that would begin in “tercer ciclo” (7th through 9th grade) in El Salvador. This simple command, capable of revealing the contents of directories, was my first spell in what would become a lifelong pursuit of programming magic.
My childhood hobbies – playing video games, guitar, and piano (a family tradition, given my father’s musical lineage) – faded into the background as I discovered the enchanting world of DOS commands. The discovery that files ending in .exe were executable spells and .com files were commands that accepted parameters opened up a new realm of possibilities.
Armed with EDIT.COM, a primitive but powerful text editor, I began experimenting with every file I could find. The real breakthrough came when I discovered AUTOEXEC.BAT, a mystical scroll that controlled the DOS startup ritual. This was my first encounter with automated script execution, though I didn’t know it at the time.
The Path of Many Languages
My journey through the programming arts led me through many schools of magic: Turbo Pascal, C++, Fox Pro (more of an application framework than a pure language), Delphi, VB6, VBA, VB.NET, and finally, my true calling: C#.
During my university years, I co-founded my first company with my cousin “Carlitos,” supported by my uncle Carlos Melgar, who had been like a father to me. While we had some coding experience, our ambition to create our own ERP system led us to expand our circle. This is where I met Abel, one of two programmers we recruited who were dating my cousins at the time. Abel, coming from a Delphi background, introduced me to a concept that would change my understanding of programming forever: reflection.
Understanding the Dark Arts of Metaprogramming
What Abel revealed to me that day was just the beginning of my journey into metaprogramming, a form of magic that allows code to examine and modify itself at runtime. In the .NET realm, this sorcery primarily manifests through reflection, a power that would have seemed impossible in my DOS days.
Let me share with you the secrets I’ve learned along this path:
The Power of Reflection: Your First Spell
// A basic spell of introspection
Type stringType = typeof(string);
MethodInfo[] methods = stringType.GetMethods();
foreach (var method in methods)
{
Console.WriteLine($"Discovered spell: {method.Name}");
}
This simple incantation allows your code to examine itself, revealing the methods hidden within any type. But this is just the beginning.
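Once you can discover methods, the next small step is invoking one of them dynamically. A minimal sketch (the method chosen here is just an example):
using System;
using System.Reflection;

// Locate a specific overload by name and parameter types...
MethodInfo toUpper = typeof(string).GetMethod("ToUpper", Type.EmptyTypes);
// ...then invoke it on an instance without any compile-time knowledge of the call
object shouted = toUpper.Invoke("abracadabra", null);
Console.WriteLine(shouted); // ABRACADABRA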
Conjuring Objects from the Void
As your powers grow, you’ll learn to create objects dynamically:
public class ObjectConjurer
{
public T SummonAndEnchant<T>(Dictionary<string, object> properties) where T : new()
{
T instance = new T();
Type type = typeof(T);
foreach (var property in properties)
{
PropertyInfo prop = type.GetProperty(property.Key);
if (prop != null && prop.CanWrite)
{
prop.SetValue(instance, property.Value);
}
}
return instance;
}
}
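A quick usage example (the Wizard type is purely hypothetical):
var conjurer = new ObjectConjurer();
Wizard wizard = conjurer.SummonAndEnchant<Wizard>(new Dictionary<string, object>
{
    ["Name"] = "Merlin",
    ["Level"] = 42
});
Console.WriteLine($"{wizard.Name} (level {wizard.Level})");
// Hypothetical target type for the conjuration
public class Wizard
{
    public string Name { get; set; }
    public int Level { get; set; }
}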
Advanced Rituals: Expression Trees
Expression<Func<int, bool>> ageCheck = age => age >= 18;
var parameter = Expression.Parameter(typeof(int), "age");
var constant = Expression.Constant(18, typeof(int));
var comparison = Expression.GreaterThanOrEqual(parameter, constant);
var lambda = Expression.Lambda<Func<int, bool>>(comparison, parameter);
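Either form only becomes executable once it is compiled into a delegate:
// Compile the hand-built expression tree into a real delegate and run it
Func<int, bool> isAdult = lambda.Compile();
Console.WriteLine(isAdult(21)); // True
Console.WriteLine(isAdult(15)); // False
// The lambda-syntax version compiles the same way
Console.WriteLine(ageCheck.Compile()(18)); // True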
The Price of Power: Security and Performance
Like any powerful magic, these arts come with risks and costs. Through my journey, I learned the importance of protective wards:
Guarding Against Dark Forces
// A protective ward for your reflective operations
// (code access security attributes like this are only enforced on .NET Framework)
[SecurityPermission(SecurityAction.Demand, ControlEvidence = true)]
public class SecretKeeper
{
    private readonly string _arcaneSecret = "xyz";
    public string RevealSecret(string authToken)
    {
        if (ValidateToken(authToken))
            return _arcaneSecret;
        // ForbiddenMagicException is a custom exception type
        throw new ForbiddenMagicException("Unauthorized attempt to access secrets");
    }
    // Placeholder validation; swap in real token verification
    private bool ValidateToken(string authToken) =>
        !string.IsNullOrWhiteSpace(authToken);
}
The Cost of Power
Ritual Type | Energy Cost (ms) | Mana Usage
Direct Cast | 1 | Baseline
Reflection | 10-20 | 2x-3x
Cached Cast | 2-3 | 1.5x
Compiled | 1.2-1.5 | 1.2x
To mitigate these costs, I learned to cache my spells:
public class SpellCache
{
private static readonly ConcurrentDictionary<string, MethodInfo> SpellBook
= new ConcurrentDictionary<string, MethodInfo>();
public static MethodInfo GetSpell(Type type, string spellName)
{
string key = $"{type.FullName}.{spellName}";
return SpellBook.GetOrAdd(key, _ => type.GetMethod(spellName));
}
}
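And to reach the “Compiled” row of the table above, a cached MethodInfo can be turned into a strongly typed delegate once and then reused at nearly direct-call speed (a minimal sketch):
// Fetch the MethodInfo once through the cache...
MethodInfo spell = SpellCache.GetSpell(typeof(string), "ToUpperInvariant");
// ...then pay the reflection cost a single time by baking an open delegate
var fastToUpper = (Func<string, string>)Delegate.CreateDelegate(
    typeof(Func<string, string>), spell);
Console.WriteLine(fastToUpper("whisper")); // WHISPER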
Practical Applications in the Modern Age
Today, these dark arts power many of our most powerful frameworks:
- Entity Framework uses reflection for its magical object-relational mapping
- Dependency Injection containers use it to automatically wire up our applications
- Serialization libraries use it to transform objects into different forms
- Unit testing frameworks use it to create test doubles and verify behavior
Wisdom for the Aspiring Sorcerer
From my journey from DOS batch files to the heights of .NET metaprogramming, I’ve gathered these pieces of wisdom:
- Cache your incantations whenever possible
- Guard your secrets with proper wards
- Measure the cost of your rituals
- Use direct casting when available
- Document your dark arts thoroughly
Conclusion
Looking back at my journey from those first DOS commands to mastering the dark arts of metaprogramming, I’m reminded that every programmer’s path is unique. That young boy who first typed DIR in MS-DOS could never have imagined where that path would lead. Today, as I work with advanced concepts like reflection and metaprogramming in .NET, I see that our field is one of continuous learning and evolution.
The dark arts of metaprogramming may be powerful, but like any tool, their true value lies in knowing when and how to use them effectively. Remember, while the ability to make code write itself might seem like sorcery, the real magic lies in understanding the fundamentals and growing from them. Whether you’re starting with basic commands like I did or diving straight into advanced concepts, every step of the journey contributes to your growth as a developer.
And who knows? Maybe one day you’ll find yourself teaching these dark arts to the next generation of digital sorcerers.
by Joche Ojeda | Jan 9, 2025 | dotnet
While researching useful features in .NET 9 that could benefit XAF/XPO developers, I discovered something particularly interesting: Version 7 GUIDs (RFC 9562 specification). These new GUIDs offer a crucial feature – they’re sortable.
This discovery brought me back to an issue I encountered two years ago while working on the SyncFramework. We faced a peculiar problem: Deltas were generated correctly but processed in the wrong order in production environments. The occurrences seemed random, and no clear pattern emerged. Initially, I thought sorting the Deltas by their primary keys (GUIDs) would ensure they were processed in generation order, but that assumption proved incorrect. Through testing, I discovered that GUID generation couldn’t be trusted to be sequential, and the issue affected multiple components of the SyncFramework: whether the GUIDs were generated in C# or at the database level, there was no guarantee of sequential ordering, and different database engines could even sort them differently. To address this, I implemented a sequence service as a workaround.
Enter .NET 9 with its Version 7 GUIDs (conforming to the RFC 9562 specification). These new GUIDs are genuinely sequential, making them reliable for sorting operations.
To demonstrate this improvement, I created a test solution for XAF with a custom base object. The key implementation occurs in the OnSaving method:
protected override void OnSaving()
{
base.OnSaving();
if (!(Session is NestedUnitOfWork) && Session.IsNewObject(this) && oid.Equals(Guid.Empty))
{
oid = Guid.CreateVersion7();
}
}
Notice the use of CreateVersion7() instead of the traditional NewGuid(). For comparison, I also created another domain object using the traditional GUID generation:
protected override void OnSaving()
{
base.OnSaving();
if (!(Session is NestedUnitOfWork) && Session.IsNewObject(this) && oid.Equals(Guid.Empty))
{
oid = Guid.NewGuid();
}
}
When creating multiple instances of the traditional GUID domain object, you’ll notice that the greater the time interval between instance creation, the less likely the GUIDs will maintain sequential ordering.
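A quick way to see the difference outside of XAF is to generate a handful of GUIDs of each kind and check whether their creation order survives a sort. This is a minimal console sketch, not part of the XAF sample:
using System;
using System.Linq;
using System.Threading;

Guid[] v4 = new Guid[5];
Guid[] v7 = new Guid[5];
for (int i = 0; i < 5; i++)
{
    v4[i] = Guid.NewGuid();        // traditional, random (version 4) GUIDs
    v7[i] = Guid.CreateVersion7(); // .NET 9 time-ordered (version 7) GUIDs
    Thread.Sleep(10);              // space the creations out a little
}
// Version 7 GUIDs sort back into creation order; version 4 GUIDs usually do not
Console.WriteLine($"v4 kept order: {v4.SequenceEqual(v4.OrderBy(g => g))}");
Console.WriteLine($"v7 kept order: {v7.SequenceEqual(v7.OrderBy(g => g))}");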
GUID Version 7 (screenshot)
GUID Old Version (screenshot)
This new feature in .NET 9 could significantly simplify scenarios where sequential ordering is crucial, eliminating the need for additional sequence services in many cases. Here is the repo on GitHub. Happy coding until next time!
Related article
On my GUID, common problems using GUID identifiers | Joche Ojeda
by Joche Ojeda | Jun 21, 2024 | Database, ORM
Why Compound Keys in Database Tables Are No Longer Valid
Introduction
In the realm of database design, compound keys were once a staple, largely driven by the need to adhere to normalization forms. However, the evolving landscape of technology and data management calls into question the continued relevance of these multi-attribute keys. This article explores the reasons why compound keys may no longer be the best choice and suggests a shift towards simpler, more maintainable alternatives like object identifiers (OIDs).
The Case Against Compound Keys
Complexity in Database Design
- Normalization Overhead: Historically, compound keys were used to satisfy normalization requirements, ensuring minimal redundancy and dependency. While normalization is still important, the rigidity it imposes can lead to overly complex database schemas.
- Business Logic Encapsulation: When compound keys include business logic, they can create dependencies that complicate data integrity and maintenance. Changes in business rules often necessitate schema alterations, which can be cumbersome.
Maintenance Challenges
- Data Integrity Issues: Compound keys can introduce challenges in maintaining data integrity, especially in large and complex databases. Ensuring the uniqueness and consistency of multi-attribute keys can be error-prone.
- Performance Concerns: Queries involving compound keys can become less efficient, as indexing and searching across multiple columns can be more resource-intensive compared to single-column keys.
The Shift Towards Object Identifiers (OIDs)
Simplified Design
- Single Attribute Keys: Using OIDs as primary keys simplifies the schema. Each row can be uniquely identified by a single attribute, making the design more straightforward and easier to understand.
- Decoupling Business Logic: OIDs help in decoupling the business logic from the database schema. Changes in business rules do not necessitate changes in the primary key structure, enhancing flexibility.
Easier Maintenance
- Improved Data Integrity: With a single attribute as the primary key, maintaining data integrity becomes more manageable. The likelihood of key conflicts is reduced, simplifying the validation process.
- Performance Optimization: OIDs allow for more efficient indexing and query performance. Searching and sorting operations are faster and less resource-intensive, improving overall database performance.
Revisiting Normalization
Historical Context
- Storage Constraints: Normalization rules were developed when data storage was expensive and limited. Reducing redundancy and optimizing storage was paramount.
- Modern Storage Solutions: Today, storage is relatively cheap and abundant. The strict adherence to normalization may not be as critical as it once was.
Balancing Act
- De-normalization for Performance: In modern databases, a balance between normalization and de-normalization can be beneficial. De-normalization can improve performance and simplify query design without significantly increasing storage costs.
- Practical Normalization: Applying normalization principles should be driven by practical needs rather than strict adherence to theoretical models. The goal is to achieve a design that is both efficient and maintainable.
ORM Design Preferences
Object-Relational Mappers (ORMs)
- Design with OIDs in Mind: Many ORMs, such as XPO from DevExpress, were originally designed to work with OIDs rather than compound keys. This preference simplifies database interaction and enhances compatibility with object-oriented programming paradigms.
- Support for Compound Keys: Although these ORMs support compound keys, their architecture and default behavior often favor the use of single-column OIDs, highlighting the practical advantages of simpler key structures in modern application development.
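To make the contrast concrete, here is a minimal, hypothetical C# sketch of the two approaches (plain classes, not tied to any particular ORM):
// Compound key: identity is spread across several business attributes,
// so changes to business rules can ripple into the key itself
public class OrderLineWithCompoundKey
{
    public string CompanyCode { get; set; }  // part of the key
    public string OrderNumber { get; set; }  // part of the key
    public int LineNumber { get; set; }      // part of the key
    public decimal Quantity { get; set; }
}
// Single object identifier (OID): identity is one opaque, stable value,
// and the business attributes are free to change without touching the key
public class OrderLineWithOid
{
    public Guid Oid { get; set; } = Guid.NewGuid();
    public string CompanyCode { get; set; }
    public string OrderNumber { get; set; }
    public int LineNumber { get; set; }
    public decimal Quantity { get; set; }
}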
Conclusion
The use of compound keys in database tables, driven by the need to fulfill normalization forms, may no longer be the best practice in modern database design. Simplifying schemas with object identifiers can enhance maintainability, improve performance, and decouple business logic from the database structure. As storage becomes less of a constraint, a pragmatic approach to normalization, balancing performance and data integrity, becomes increasingly important. Embracing these changes, along with leveraging ORM tools designed with OIDs in mind, can lead to more robust, flexible, and efficient database systems.