Windows Server Setup Guide with PowerShell

In one of our meetings with Javier, we were discussing how many servers we have in the office. In the end, it turned out that we have a lot of servers, both Windows and Linux. So we decided to take a look and see what is running on each of the servers. A lot of those servers are actually test servers used to test deployments or to show something to a customer. A few of them were just full of examples or basically nothing, so I decided to format them and rebuild the installation.

I decided to start with the Windows servers, so this post is going to be about the tools and scripts that I use to set up a Windows Server 2016 machine.

There were a few tasks I needed to accomplish that I usually do through the UI, and most of them are a pain in the ****. There is no other way to describe it, so I decided to create scripts instead, so I can replicate the setup easily between servers.

Disable Internet Explorer Enhanced Security

The first task I do when I set up a Windows Server is to disable Internet Explorer Enhanced Security. If you have used that type of security before, you know that it forces you to allow or whitelist every URL on the page you’re browsing, plus all the related pages that page pulls in. It’s easily 100 clicks per page. To remove the enhanced security, you normally have to go into the Windows features, turn it off there, and then restart. So I created a script that does it for me: in one click I can disable the security, so I can use Internet Explorer to download something newer, like Microsoft Edge.

Disable internet explorer Enhanced Security
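A minimal sketch of what this kind of script can look like, using the well-known Active Setup registry keys for IE Enhanced Security Configuration (run it as administrator; the two GUIDs below are the standard admin and user component IDs):

    # Turn off IE Enhanced Security Configuration for administrators and users.
    $adminKey = 'HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A7-37EF-4b3f-8CFC-4F3A74704073}'
    $userKey  = 'HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A8-37EF-4b3f-8CFC-4F3A74704073}'
    Set-ItemProperty -Path $adminKey -Name 'IsInstalled' -Value 0
    Set-ItemProperty -Path $userKey -Name 'IsInstalled' -Value 0
    # Restart Explorer so the change is picked up without a full reboot.
    Stop-Process -Name explorer -ErrorAction SilentlyContinue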

Set Up Web Server Role

The next step after disabling the enhanced security is to set up the Web Server role in Windows Server. This doesn’t come enabled out of the box; you have to add the role to the server yourself. For that, I use another script, which also installs Web Deploy, the functionality that allows you to deploy remotely to an IIS server.

Setup Web Server Role
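As a rough sketch of the approach (the Web Deploy MSI path below is illustrative and would come from a download step in the real script):

    # Add the IIS Web Server role with the management tools.
    Install-WindowsFeature -Name Web-Server -IncludeManagementTools

    # Web Deploy is a separate installer; once downloaded, it can be installed silently.
    Start-Process msiexec.exe -ArgumentList '/i C:\Temp\WebDeploy_amd64.msi /quiet ADDLOCAL=ALL' -Wait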

Fix Web Deploy Permissions (Optional)

Now here is an extra, optional step. For some reason, on some servers, even with a clean installation, there is a problem setting up the Web Deploy functionality. It’s basically a permission problem, so there is a script to fix that. You need to have run the previous script (the one that installs the Web Server role and Web Deploy) first. Again, this is optional; use it only if your Web Deploy doesn’t work.

Fix web deploy permissions

Set Up SQL Server Express

The next step is to set up SQL Server Express, and I have a script for that too. I will paste it here, but for some reason the script always fails at the point where it downloads the installer and tries to run it on the server: the process gets busy and the files end up locked. I will have to come back to that one later, but I am posting the script here so I don’t forget it.

SqlServer Express install
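In outline, the unattended install boils down to running the SQL Server setup with quiet-mode switches. This sketch assumes the installer has already been downloaded and extracted, which is exactly the part that keeps failing because of the locked files:

    # Silent SQL Server Express install (installer already extracted to an illustrative path).
    $setupPath = 'C:\Temp\SQLEXPR\SETUP.EXE'
    $arguments = '/Q /ACTION=Install /FEATURES=SQLEngine /INSTANCENAME=SQLEXPRESS ' +
                 '/IACCEPTSQLSERVERLICENSETERMS /SQLSYSADMINACCOUNTS="BUILTIN\Administrators"'
    Start-Process -FilePath $setupPath -ArgumentList $arguments -Wait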

Enable Remote SQL Server Access (Optional)

OK, the next script is also optional. On our test servers, we usually allow remote access to the SQL Server database because we need to either restore a backup or create a database. For this, we need to do two things: open the firewall port for the database and enable TCP connections from the outside. So here is a script for that too.

Enable Remote SQL Server Access
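In outline, the script opens TCP port 1433 in the firewall and then enables the TCP/IP protocol through the SQL Server SMO objects (the SQLEXPRESS instance name is an assumption; adjust it for your server):

    # Open the firewall for the default SQL Server port.
    New-NetFirewallRule -DisplayName 'SQL Server' -Direction Inbound -Protocol TCP -LocalPort 1433 -Action Allow

    # Enable the TCP/IP protocol for the instance and restart the service.
    Import-Module SqlServer
    $wmi = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer
    $tcp = $wmi.ServerInstances['SQLEXPRESS'].ServerProtocols['Tcp']
    $tcp.IsEnabled = $true
    $tcp.Alter()
    Restart-Service -Name 'MSSQL$SQLEXPRESS'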

So that will be it for this post. Let me know which script you would like to have to automate your server setup.

Setting Up WSL 2: My Development Environment Scripts

After a problematic Windows update on my Surface computer that prevented me from compiling .NET applications, I spent days trying various fixes without success. Eventually, I had to format my computer and start fresh. This meant setting up everything again – Visual Studio, testing databases, and all the other development tools.

To make future setups easier, I created a collection of WSL 2 scripts that automate the installation of tools I frequently use, like PostgreSQL and MySQL for testing purposes. While these scripts contain some practices that wouldn’t be recommended for production (like hardcoded passwords), they’re specifically designed for testing environments. The passwords used are already present in the sync framework source code, so there’s no additional security risk.

I decided to share these scripts not as a perfect solution, but as a starting point for others who might need to set up similar testing environments. You can use them as inspiration for your own scripts or modify the default passwords to match your needs.

Note that these are specifically for testing purposes – particularly for working with the sync framework – and the hardcoded credentials should never be used in a production environment.

https://github.com/egarim/MyWslScripts

LDAP Scripts

MyWslScripts/ldap-setup.sh at master · egarim/MyWslScripts

MyWslScripts/add-ldap-user.sh at master · egarim/MyWslScripts

MySQL

MyWslScripts/install_mysql.sh at master · egarim/MyWslScripts

Postgres

MyWslScripts/install_postgres.sh at master · egarim/MyWslScripts

Redis

MyWslScripts/redis-install.sh at master · egarim/MyWslScripts

Let me know if you’d like me to share the actual scripts in a follow-up post!

Understanding System Abstractions for LLM Integration

I’ve been thinking about this topic for a while and have collected numerous notes and ideas about how to present abstractions that allow large language models (LLMs) to interact with various systems – whether that’s your database, operating system, word documents, or other applications.

Before diving deeper, let’s review some fundamental concepts:

Key Concepts

First, let’s talk about APIs (Application Programming Interfaces). In simple terms, an API is a way to expose methods, functions, and procedures from your application, independently of the programming language being used.

Next is the REST API concept, which is a method of exposing your API using HTTP verbs. As IT professionals, we hear these terms – HTTP, REST, API – almost daily, but we might not fully grasp their core concepts. Let me explain how they relate to software automation using AI.

HTTP (Hypertext Transfer Protocol) is fundamentally a way for two applications to communicate using text. This is its beauty – text serves as the basic layer of understanding between systems, meaning almost any system or programming language can produce a client or server that can interact via HTTP.

REST (Representational State Transfer) is a methodology for systems to communicate and either change or read the state of another system.

Levels of System Interaction

When implementing LLMs for system automation, we first need to determine our desired level of interaction. Here are several approaches:

  1. Human-like Interaction: An LLM can interact with your operating system using mouse and keyboard inputs, effectively mimicking human behavior.
  2. REST API Integration: Your application can communicate over HTTP using HTTP verbs and REST conventions.
  3. SDK Implementation: You can create a software development kit that describes your application’s functionality and expose this to the LLM.

The connection method will vary depending on your chosen technology. For instance:

  • Microsoft Semantic Kernel allows you to create plugins that interact with your system through a REST API, a database, or an SDK (see the sketch after this list).
  • Microsoft AI extensions require you to decide on your preferred interaction level before implementation.
  • The Model Context Protocol is a newer approach that enables application exposure for LLM agents, with Claude from Anthropic being a notable example.
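
To make the plugin/SDK idea more concrete, here is a minimal sketch of a Semantic Kernel plugin. It assumes the Microsoft.SemanticKernel NuGet package, and the plugin class and method are purely illustrative:

    using System.ComponentModel;
    using Microsoft.SemanticKernel;

    // Register the plugin so the model can plan calls to its functions.
    var builder = Kernel.CreateBuilder();
    builder.Plugins.AddFromType<CustomerPlugin>();
    Kernel kernel = builder.Build();

    // Illustrative plugin: exposes a piece of application functionality to the LLM.
    public class CustomerPlugin
    {
        [KernelFunction, Description("Returns the outstanding balance for a customer.")]
        public decimal GetCustomerBalance([Description("The customer id")] int customerId)
        {
            // In a real application this would query your database or call your REST API.
            return 1250.75m;
        }
    }

The same class could just as easily sit behind a REST endpoint; the point is that the descriptions are what the LLM uses to decide when and how to call your code.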

Implementation Considerations

When automating your system, you need to consider:

  1. Available Integration Options: Not all systems provide an SDK or API, which can limit automation possibilities.
  2. Interaction Protocol Choice: You’ll need to decide between REST API, HTTP, or Model Context Protocol.

This overview should help you understand the different levels of interaction available for automating your application. What’s your preferred method for integrating LLMs with your applications? I’d love to hear your thoughts and experiences.

Bridging Traditional Development using XAF and AI: Training Sessions in Cairo

I recently had the privilege of conducting a training session in Cairo, Egypt, focusing on modern application development approaches. The session covered two key areas that are transforming how we build business applications: application frameworks and AI integration.

Streamlining Development with Application Frameworks

One of the highlights was demonstrating DevExpress’s eXpressApp Framework (XAF). The students were particularly impressed by how quickly we could build fully-functional Line of Business (LOB) applications. XAF’s approach eliminates much of the repetitive coding typically associated with business application development:

  • Automatic CRUD operations
  • Built-in security system
  • Consistent UI across different platforms
  • Rapid prototyping capabilities

Seamless Integration: XAF Meets Microsoft Semantic Kernel

What made this training unique was demonstrating how XAF’s capabilities extend into AI territory. We built the entire AI interface using XAF itself, showcasing how a traditional LOB framework can seamlessly incorporate advanced AI features. The audience, coming primarily from JavaScript backgrounds with Angular and React experience, was particularly impressed by how this approach simplified the integration of AI into business applications.

During the demonstrations, we explored practical implementations using Microsoft Semantic Kernel. The students were fascinated by practical demonstrations of:

  • Natural language processing for document analysis
  • Automated content generation for business documentation
  • Intelligent decision support systems
  • Context-aware data processing

Student Engagement and Outcomes

The response from the students, most of whom came from JavaScript development backgrounds, was overwhelmingly positive. As experienced frontend developers using Angular and React, they were initially skeptical about a different approach to application development. However, their enthusiasm peaked when they saw how these technologies could solve real business challenges they face daily. The combination of XAF’s rapid development capabilities and Semantic Kernel’s AI features, all integrated into a cohesive development experience, opened their eyes to new possibilities in application development.

Looking Forward

This training session in Cairo demonstrated the growing appetite for modern development approaches in the region. The intersection of efficient application frameworks and AI capabilities is proving to be a powerful combination for next-generation business applications.

And last, but not least, some pictures )))

 

 

Hard to Kill: Why Auto-Increment Primary Keys Can Make Data Sync Die Harder

Working with the SyncFramework, I’ve noticed a recurring pattern when discussing schema design with customers. One crucial question that often surprises them is about their choice of primary keys: “Are you using auto-incremental integers or unique identifiers (like GUIDs)?”

Approximately 90% of users rely on auto-incremental integer primary keys. While this seems like a straightforward choice, it can create significant challenges for data synchronization. Let’s dive deep into how different database engines handle auto-increment values and why this matters for synchronization scenarios.

Database Implementation Deep Dive

SQL Server

SQL Server uses the IDENTITY property, storing current values in system tables (sys.identity_columns) and caching them in memory for performance. During restarts, it reads the last used value from these system tables. The values are managed as 8-byte numbers internally, with new ranges allocated when the cache is exhausted.

MySQL

MySQL’s InnoDB engine maintains auto-increment counters in memory. In older versions the counter was not durably persisted, so after a restart InnoDB re-derived it by scanning the table for the maximum used value; since MySQL 8.0 the counter is persisted via the redo log. Each table has its own counter in its metadata.

PostgreSQL

PostgreSQL takes a different approach, using separate sequence objects stored in the pg_class catalog. These sequences maintain their own relation files containing crucial metadata like last value, increment, and min/max values. The sequence data is periodically checkpointed to disk for durability.

Oracle

Oracle traditionally uses sequences and triggers, with modern versions (12c+) supporting identity columns. The sequence information is stored in the SEQ$ system table, tracking the last number used, cache size, and increment values.

The Synchronization Challenge

This diversity in implementation creates several challenges for data synchronization:

  1. Unpredictable Sequence Generation: Even within the same database engine, gaps can occur due to rolled-back transactions or server restarts.
  2. Infrastructure Dependencies: The mechanisms for generating next values are deeply embedded within each database engine and aren’t easily accessible to frameworks like Entity Framework or XPO.
  3. Cross-Database Complexity: When synchronizing across different database instances, coordinating auto-increment values becomes even more complex.

The GUID Alternative

Using GUIDs (Globally Unique Identifiers) as primary keys offers a solution to these synchronization challenges. While GUIDs come with their own set of considerations, they provide guaranteed uniqueness across distributed systems without requiring centralized coordination.

Traditional GUID Concerns

  • Index fragmentation
  • Storage size
  • Performance impact

Modern Solutions

These concerns have been addressed through:

  • Sequential GUID generation techniques
  • Improved indexing in modern databases
  • Optimizations in .NET 9, such as built-in UUID v7 generation (see the sketch below)
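
A minimal sketch of what sequential GUID generation can look like, assuming .NET 9 (the entity class below is illustrative and not part of SyncFramework):

    using System;

    public static class KeyGenerator
    {
        // Guid.CreateVersion7 (new in .NET 9) embeds a Unix timestamp in the first bytes,
        // so new keys sort roughly by creation time and index fragmentation is reduced.
        public static Guid NewSequentialId() => Guid.CreateVersion7();
    }

    // Illustrative entity using a GUID primary key instead of an auto-increment integer.
    public class Customer
    {
        public Guid Id { get; set; } = KeyGenerator.NewSequentialId();
        public string Name { get; set; } = string.Empty;
    }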

Recommendations

When designing systems that require data synchronization:

  1. Consider using GUIDs instead of auto-increment integers for primary keys
  2. Evaluate sequential GUID generation for better performance
  3. Understand that auto-increment values, while simple, can complicate synchronization scenarios
  4. Plan for the infrastructure needed to maintain consistent primary key generation across your distributed system

Conclusion

The choice of primary key strategy significantly impacts your system’s ability to handle data synchronization effectively. While auto-increment integers might seem simpler at first, understanding their implementation details across different databases reveals why GUIDs often provide a more robust solution for distributed systems.

Remember: Data synchronization is not a trivial problem, and your primary key strategy plays a crucial role in its success. Take the time to evaluate your requirements and choose the appropriate approach for your specific use case.

Till next time, happy delta encoding.

 
SyncFramework for XPO: Updated for .NET 8 & 9 and DevExpress 24.2.3!

SyncFramework for XPO is a specialized implementation of our delta encoding synchronization library, designed specifically for DevExpress XPO users. It enables efficient data synchronization by tracking and transmitting only the changes between data versions, optimizing both bandwidth usage and processing time.

What’s New

  • Base target framework updated to .NET 8.0
  • Added compatibility with .NET 9.0
  • Updated DevExpress XPO dependencies to 24.2.3
  • Continued support for delta encoding synchronization
  • Various performance improvements and bug fixes

Framework Compatibility

  • Primary Target: .NET 8.0
  • Additional Support: .NET 9.0

Our XPO implementation continues to serve the DevExpress community.

Key Features

  • Seamless integration with DevExpress XPO
  • Efficient delta-based synchronization
  • Support for multiple database providers
  • Cross-platform compatibility
  • Easy integration with existing XPO and XAF applications

As always, if you own a license, you can compile the source code yourself from our GitHub repository. The framework maintains its commitment to providing reliable data synchronization for XPO applications.

Happy Delta Encoding!

 

SyncFramework Update: Now Supporting .NET 9 and EfCore 9!

SyncFramework is a C# library that simplifies data synchronization using delta encoding technology. Instead of transferring entire datasets, it efficiently synchronizes by tracking and transmitting only the changes between data versions, significantly reducing bandwidth and processing overhead.

What’s New

  • All packages now target .NET 9
  • BIT.Data.Sync packages updated to support the latest framework
  • Entity Framework Core packages upgraded to EF Core 9
  • Various minor fixes and improvements

Available Implementations

  • SyncFramework for XPO: For DevExpress XPO users
  • SyncFramework for Entity Framework Core: For EF Core users

Package Statistics

Our packages have been serving the community well, with steady adoption:

  • BIT.Data.Sync: 2,142 downloads
  • BIT.Data.Sync.AspNetCore: 1,064 downloads
  • BIT.Data.Sync.AspNetCore.Xpo: 521 downloads
  • BIT.Data.Sync.EfCore: 1,691 downloads
  • BIT.Data.Sync.EfCore.Npgsql: 1,120 downloads
  • BIT.Data.Sync.EfCore.Pomelo.MySql: 1,172 downloads
  • BIT.Data.Sync.EfCore.Sqlite: 887 downloads
  • BIT.Data.Sync.EfCore.SqlServer: 982 downloads

Resources

NuGet Packages
Source Code

As always, you can compile the source code yourself from our GitHub repository. The framework continues to provide reliable data synchronization across different platforms and databases.

Happy Delta Encoding!

Say my name: The Evolution of Shared Libraries

During my recent AI research break, I found myself taking a walk down memory lane, reflecting on my early career in data analysis and ETL operations. This journey brought me back to an interesting aspect of software development that has evolved significantly over the years: the management of shared libraries.

The VB6 Era: COM Components and DLL Hell

My journey began with Visual Basic 6, where shared libraries were managed through COM components. The concept seemed straightforward: store shared DLLs in the Windows System directory (typically C:\Windows\System32) and register them using regsvr32.exe. The Windows Registry kept track of these components under HKEY_CLASSES_ROOT.

However, this system had a significant flaw that we now famously know as “DLL Hell.” Let me share a practical example: Imagine you have two systems, A and B, both using Crystal Reports 7. If you uninstall either system, the other would break because the shared DLL would be removed. Version control was primarily managed by location, making it a precarious system at best.

Enter .NET Framework: The GAC Revolution

When Microsoft introduced the .NET Framework, it brought a sophisticated solution to these problems: the Global Assembly Cache (GAC). Located at C:\Windows\Microsoft.NET\assembly\ (for .NET 4.0 and later), the GAC represented a significant improvement in shared library management.

The most revolutionary aspect was the introduction of assembly identity. Instead of relying solely on filenames and locations, each assembly now had a unique identity consisting of:

  • Simple name (e.g., “MyCompany.MyLibrary”)
  • Version number (e.g., “1.0.0.0”)
  • Culture information
  • Public key token

A typical assembly full name would look like this:

MyCompany.MyLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089

This robust identification system meant that multiple versions of the same assembly could coexist peacefully, solving many of the versioning nightmares that plagued the VB6 era.
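
As a small illustration (using the hypothetical assembly name from above), this is roughly how the runtime is asked for a specific identity rather than a file path:

    using System;
    using System.Reflection;

    class Program
    {
        static void Main()
        {
            // Load by full identity: name, version, culture and public key token all take part in the lookup.
            var assembly = Assembly.Load(
                "MyCompany.MyLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089");
            Console.WriteLine(assembly.FullName);
        }
    }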

The Modern Approach: Private Dependencies

Fast forward to 2025, and we’re living in what I call the “brave new world” of cross-platform .NET. The landscape has changed dramatically. Storage is no longer the premium resource it once was, and the trend has shifted away from shared libraries toward application-local deployment.

Modern applications often ship with their own private version of the .NET runtime and dependencies. This approach eliminates the risks associated with shared components and gives applications complete control over their runtime environment.

Reflection on Technology Evolution

While researching Blazor’s future and seeing discussions about Microsoft’s technology choices, I’m reminded that technology evolution is a constant journey. Organizations move slowly in production environments, and that’s often for good reason. The shift from COM components to GAC to private dependencies wasn’t just a technical evolution – it was a response to real-world problems and changing resources.

This journey from VB6 to modern .NET reveals an interesting pattern: sometimes the best solution isn’t sharing resources but giving each application its own isolated environment. It’s fascinating how the decreasing cost of storage and the increasing need for reliability have transformed our approach to dependency management.

As I return to my AI research, this trip down memory lane serves as a reminder that while technology constantly evolves, understanding its history helps us appreciate the solutions we have today and better prepare for the challenges of tomorrow.

Happy coding!

ADOMD.NET: Beyond Rows and Columns – The Multidimensional Evolution of ADO.NET

When I first encountered the challenge of migrating hundreds of Visual Basic 6 reports to .NET, I never imagined it would lead me down a path of discovering specialized data analytics tools. Today, I want to share my experience with ADOMD.NET and how it could have transformed our reporting challenges, even though we couldn’t implement it due to our database constraints.

The Challenge: The Sales Gap Report

The story begins with a seemingly simple report called “Sales Gap.” Its purpose was critical: identify periods when regular customers stopped purchasing specific items. For instance, if a customer typically bought 10 units monthly from January to May, then suddenly stopped in June and July, sales representatives needed to understand why.

This report required complex queries across multiple transactional tables:

  • Invoicing
  • Sales
  • Returns
  • Debits
  • Credits

Initially, the report took about a minute to run. As our data grew, so did the execution time—eventually reaching an unbearable 15 minutes. We were stuck with a requirement to use real-time transactional data, making traditional optimization techniques like data warehousing off-limits.

Enter ADOMD.NET: A Specialized Solution

ADOMD.NET (ActiveX Data Objects Multidimensional .NET) emerged as a potential solution. Here’s why it caught my attention:

Key Features:

  1. Multidimensional Analysis

    Unlike traditional SQL queries, ADOMD.NET uses MDX (Multidimensional Expressions), a language designed specifically for analytical queries. Here’s a basic example (an execution sketch follows this list):

    string mdxQuery = @"
        SELECT 
            {[Measures].[Sales Amount]} ON COLUMNS,
            {[Date].[Calendar Year].MEMBERS} ON ROWS
        FROM [Sales Cube]
        WHERE [Product].[Category].[Electronics]";
  2. Performance Optimization

    ADOMD.NET is built for analytical workloads, offering better performance for complex calculations and aggregations. It achieves this through:

    • Specialized data structures for multidimensional analysis
    • Efficient handling of hierarchical data
    • Built-in support for complex calculations
  3. Advanced Analytics Capabilities

    The tool supports sophisticated analysis patterns like:

    string mdxQuery = @"
        WITH MEMBER [Measures].[GrowthVsPreviousYear] AS
            ([Measures].[Sales Amount] - 
            ([Measures].[Sales Amount], [Date].[Calendar Year].PREVMEMBER)
            )/([Measures].[Sales Amount], [Date].[Calendar Year].PREVMEMBER)
        SELECT 
            {[Measures].[Sales Amount], [Measures].[GrowthVsPreviousYear]} 
        ON COLUMNS...";
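
And to show how such a query is actually executed, here is a minimal sketch using the ADOMD.NET client (it assumes the Microsoft.AnalysisServices.AdomdClient package; the server, catalog, and cube names are illustrative):

    using System;
    using Microsoft.AnalysisServices.AdomdClient;

    // Run an MDX query against an Analysis Services cube and print the results.
    string mdxQuery = "SELECT {[Measures].[Sales Amount]} ON COLUMNS FROM [Sales Cube]";

    using var connection = new AdomdConnection("Data Source=localhost;Catalog=SalesAnalysis");
    connection.Open();

    using var command = new AdomdCommand(mdxQuery, connection);
    using AdomdDataReader reader = command.ExecuteReader();
    while (reader.Read())
    {
        // Each column corresponds to a cell returned by the MDX axes.
        Console.WriteLine(reader.GetValue(0));
    }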

Lessons Learned

While we couldn’t implement ADOMD.NET due to our use of Pervasive Database instead of SQL Server, the investigation taught me valuable lessons about report optimization:

  1. The importance of choosing the right tools for analytical workloads
  2. The limitations of running complex analytics on transactional databases
  3. The value of specialized query languages for different types of data analysis

Modern Applications

Today, ADOMD.NET continues to be relevant for organizations using:

  • SQL Server Analysis Services (SSAS)
  • Azure Analysis Services
  • Power BI Premium datasets

If I were facing the same challenge today with SQL Server, ADOMD.NET would be my go-to solution for:

  • Complex sales analysis
  • Customer behavior tracking
  • Performance-intensive analytical reports

Conclusion

While our specific situation with Pervasive Database prevented us from using ADOMD.NET, it remains a powerful tool for organizations using Microsoft’s analytics stack. The experience taught me that sometimes the solution isn’t about optimizing existing queries, but about choosing the right specialized tools for analytical workloads.

Remember: Just because you can run analytics on your transactional database doesn’t mean you should. Tools like ADOMD.NET exist for a reason, and understanding when to use them can save countless hours of optimization work and provide better results for your users.

 

Back to the Future of Dev Tools: DevExpress CLI templates

My mom used to say that fashion is cyclical – whatever you do will eventually come back around. I’ve come to realize the same principle applies to technology. Many technologies have come and gone, only to resurface again in new forms.

Take Command Line Interface (CLI) commands, for example. For years, the industry pushed to move away from CLI towards graphical interfaces, promising a more user-friendly experience. Yet here we are in 2025, witnessing a remarkable return to CLI-based tools, especially in software development.

As a programmer, efficiency is key – particularly when dealing with repetitive tasks. This became evident when my business partner Javier and I decided to create our own application templates for Visual Studio. The process was challenging, mainly because Visual Studio’s template infrastructure isn’t well maintained. Documentation was sparse, and the whole process felt cryptic.

Our first major project was creating a template for Xamarin.Forms (now .NET MAUI), aiming to build a multi-target application template that could work across Android, iOS, and Windows. We relied heavily on James Montemagno’s excellent resources and videos to navigate this complex territory.

The task became significantly easier with the introduction of the new SDK-style projects. Compared to the older MSBuild project types, which were notoriously complex to template, the new format makes creating custom project templates much more straightforward.

In today’s development landscape, most application templates are distributed as NuGet packages, making them easier to share and implement. Interestingly, these packages are primarily designed for CLI use rather than Visual Studio’s graphical interface – a perfect example of technology coming full circle.

Following this trend, DevExpress has developed a new set of application templates that work cross-platform using the CLI. These templates leverage SkiaSharp for UI rendering, enabling true multi-IDE and multi-OS compatibility. While they’re not yet compatible with Apple Silicon, that support is likely coming in future updates.

The templates utilize CLI under the hood to generate new project structures. When you install these templates in Visual Studio Code or Visual Studio, they become available through both the CLI and the graphical interface, offering developers the best of both worlds.

Here is the official DevExpress blog post for the new application templates

https://www.devexpress.com/subscriptions/whats-new/#project-template-gallery-net8

Templates for Visual Studio

DevExpress Template Kit for Visual Studio – Visual Studio Marketplace

Templates for VS Code

DevExpress Template Kit for VS Code – Visual Studio Marketplace

If you want to see the list of the newly installed DevExpress templates, you can run the following command in the terminal:

dotnet new list dx

 

I’d love to hear your thoughts on this technological cycle. Which approach do you prefer for creating new projects – CLI or graphical interface? Let me know in the comments below!