DUID is a fully featured replacement for GUIDs (Globally Unique Identifiers). DUIDs are more compact, more web-friendly, and provide more entropy than GUIDs. We created this identifier type as a replacement for GUIDs in our projects, improving on several GUID shortcomings.
We pronounce it doo-id, but it can also be pronounced like dude, which is by design :)
You can use DUIDs as IDs for user accounts and database records, in JWTs, as unique code entity (e.g. variable) names, and more. They're an ideal replacement for GUIDs that need to be used in web scenarios.
Uses the latest .NET cryptographic random number generator
More entropy than GUID v4 (128 bits vs 122 bits)
No embedded timestamp (reduces predictability and improves strength)
Self-contained; does not use any packages
High performance, with minimal allocations
16 bytes in size; 22 characters as a string
Always starts with a letter (can be used as-is for programming language variable names)
URL-safe
Can be validated, parsed, and compared
Can be created from and converted to byte arrays
UTF-8 encoding support
JSON serialization support
TypeConverter support
Debug support (displays as string in the debugger)
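To get a feel for why 16 random bytes come out to a 22-character, URL-safe string, here's a minimal, self-contained sketch. This is an illustration only, not the library's actual implementation; for one thing, the real DUID also guarantees a leading letter, which this naive version does not.

```csharp
using System;
using System.Security.Cryptography;

// Generate 16 cryptographically random bytes (128 bits of entropy).
var bytes = RandomNumberGenerator.GetBytes(16);

// Base64 yields 24 characters for 16 bytes, including "==" padding.
// Dropping the padding and swapping '+' and '/' for URL-safe
// characters leaves exactly 22 characters.
var encoded = Convert.ToBase64String(bytes)
    .TrimEnd('=')
    .Replace('+', '-')
    .Replace('/', '_');

Console.WriteLine(encoded); // e.g. aZ3x9Kf8LmN2QvW1YbXcDe
```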
You can also find DUID on NuGet. Look for the package named fynydd.duid.
Similar to Guid.NewGuid(), you can generate a new DUID by calling the static NewDuid() method:
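In a pseudocode-style sketch (the Duid type name here is my assumption based on the package name; check the package for the exact type):

```csharp
// "Duid" as the type name is assumed; see the fynydd.duid package docs.
var id = Duid.NewDuid();

Console.WriteLine(id.ToString());
```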
This will produce a new DUID, for example: aZ3x9Kf8LmN2QvW1YbXcDe. There are a ton of overloads and extension methods for converting, validating, parsing, and comparing DUIDs.
Here are some examples:
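Here's a pseudocode-style sketch of that surface area. The member names below are assumptions for illustration; the package's actual overloads may differ.

```csharp
// All member names below are assumed for illustration.
var id = Duid.NewDuid();

var text = id.ToString();         // 22-character, URL-safe string
var bytes = id.ToByteArray();     // 16-byte representation
var fromBytes = new Duid(bytes);  // round-trip from a byte array

if (Duid.TryParse(text, out var parsed))
{
    Console.WriteLine(parsed == id);
}
```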
There is also a JSON converter for System.Text.Json that provides seamless serialization and deserialization of DUIDs:
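Registration typically looks something like the sketch below. The DuidJsonConverter type name is my assumption, and the package may also wire this up automatically via a JsonConverter attribute on the type.

```csharp
using System.Text.Json;

// "DuidJsonConverter" is an assumed name; check the package for the
// actual converter type.
var options = new JsonSerializerOptions();
options.Converters.Add(new DuidJsonConverter());

var json = JsonSerializer.Serialize(new { Id = Duid.NewDuid() }, options);
```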
There's usually more to the story so if you have questions or comments about this post let us know!
Do you need a new software development partner for an upcoming project? We would love to work with you! From websites and mobile apps to cloud services and custom software, we can help!
Sfumato is a pair of tools that generate CSS for your web-based projects. You can create HTML markup and use pre-defined utility class names to style the rendered markup, without actually writing any CSS code or leaving your HTML editor!
The resulting HTML markup is clean, readable, and consistent. And the generated CSS file is tiny, even if it hasn't been minified!
The first tool is the command line interface (CLI), which you can install once and use on any project. It watches and builds CSS as you work.
The second tool is a NuGet package that you can add to a compatible .NET project. After adding a snippet of code to your startup, Sfumato will build and watch as you run or debug your app, generating CSS on the fly.
Sfumato is compatible with the Tailwind CSS v4 class naming structure and has the following additional features:
Sfumato supports ASP.NET, Blazor, and other Microsoft stack projects by handling "@@" escapes in razor/cshtml markup files. So you can use arbitrary variants and container utilities like "@container" by escaping them in razor syntax (e.g. "@@container").
In addition to using the CLI tool to build and watch your project, you can instead add the Sfumato Core NuGet package to your project to have Sfumato build CSS as you debug, build, or publish.
Sfumato features can be used in imported CSS files without any modifications; it just works. Tailwind's Node.js pipeline requires additional changes to imported CSS files that use Tailwind features, and its setup is finicky.
Unlike Tailwind, Sfumato allows you to provide "system", "light", and "dark" options in your web app without writing any JavaScript code (other than widget UI code).
In addition to the standard media breakpoint variants (e.g. sm, md, lg, etc.) Sfumato has adaptive breakpoints that use viewport aspect ratio for better device identification (e.g. mobi, tabp, tabl, desk, etc.).
Sfumato includes form field styles that are class name compatible with the Tailwind forms plugin.
The Sfumato color library provides 20 shade steps per color (values of 50-1000 in increments of 50).
Sfumato combines media queries (like dark theme styles), reducing the size of the generated CSS even without minification.
Sfumato supports redirected input for use in automation workflows.
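As a quick illustration, markup that uses these features might look something like this. The class names are representative Tailwind-style utilities, tabp is one of the adaptive breakpoints mentioned above, and the last element shows the "@@" escape you would use in a razor/cshtml file:

```html
<!-- Padding and text size change at the "tabp" (tablet portrait)
     adaptive breakpoint; colors change in dark mode -->
<div class="p-4 tabp:p-8 bg-white dark:bg-slate-900">
    <p class="text-base tabp:text-lg">Hello, Sfumato!</p>
</div>

<!-- In a razor/cshtml file, escape the @ prefix with @@ -->
<div class="@@container">
    ...
</div>
```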
An Amazon Web Services (AWS) outage shut down a good portion of the Internet on Monday (October 20, 2025), affecting websites, apps, and services. Updates are still being released, but here's what we know so far.
At 3:00am on Monday morning, a problem with one of the core AWS database products (DynamoDB) knocked many of the leading apps, streaming services, and websites offline for millions of users across the globe.
The problem seems to have originated at one of the main AWS data centers in Ashburn, Virginia following a software update to the DynamoDB API, which is a cloud database service used by online platforms to store user and app data. This area of Virginia is known colloquially as Data Center Alley because it has the world's largest concentration of data centers.
The specific issue appears to be an error in the update that affected DynamoDB's DNS entries. The Domain Name System (DNS) routes domain name requests (e.g. fynydd.com) to their correct server IP addresses (e.g. 1.2.3.4). Since the service's domain names couldn't be matched with server IP addresses, the DynamoDB service itself could not be reached. Any apps or services that rely on DynamoDB would then experience intermittent connectivity issues or even complete outages.
Hundreds of companies have likely been affected worldwide. Some notable examples include:
Amazon
Apple TV
Chime
DoorDash
Fortnite
Hulu
Microsoft Teams
The New York Times
Netflix
Ring
Snapchat
T-Mobile
Verizon
Venmo
Zoom
In a statement, the company said: “All AWS Services returned to normal operations at 3:00pm. Some services such as AWS Config, Redshift, and Connect continue to have a backlog of messages that they will finish processing over the next few hours.”
This is not the first large-scale AWS service disruption. Notable outages occurred in 2021 and 2023, leaving customers unable to access airline tickets and payment apps. And it will undoubtedly not be the last.
Most of the Internet is served by a handful of cloud providers, which offer scalability, flexibility, and cost savings for businesses around the world. Amazon provides these services for close to 30% of the Internet. Given the reality that future updates can fail or break key infrastructure, the best these providers can do is ensure customer data integrity and maintain a solid remediation and failover process.
Intel Panther Lake uses the new 18A node process
Intel’s Fab 52 chip fabrication facility in Chandler, Arizona is now producing chips using Intel’s new 18A process, which is the company’s most advanced 2 nanometer-class technology. The designation “18A” signals a paradigm shift. Because each nanometer equals 10 angstroms, “18A” implies a roughly 18 angstrom (1.8nm) process, marking the beginning of atomic-scale design, at least from a marketing perspective.
So what else is new with the 18A series?
18A introduces Intel’s RibbonFET, its first gate-all-around (GAA) transistor design. By wrapping the gate entirely around the channel, it achieves superior control, lower leakage, and higher performance-per-watt than FinFETs. Intel claims about 15% better efficiency than its prior Intel 3 node.
(As an aside, Apple is said to be adopting TSMC's 3D stacking technology in its upcoming M5 series of chips.)
Equally transformative is PowerVIA, Intel’s new way of routing power from the back of the wafer. Traditional front-side power routing competes with signal wiring and limits transistor density. By moving power connections underneath, PowerVIA frees up routing space and can boost density by up to 30%. Intel is the first to deploy this at scale. TSMC and Samsung are still preparing their equivalents.
Intel plans 18A variants like 18A-P (performance) and 18A-PT (optimized for stacked packaging), positioning the node as both a production platform and a foundry offering for external customers. It’s the test case for Intel’s ambition to rejoin the leading edge of contract manufacturing.
18A isn’t just a smaller process. It’s a conceptual threshold. It unites two generational shifts: a new transistor architecture (GAA) and a new physical paradigm (backside power). Together, they allow scaling beyond what classic planar or FinFET approaches can achieve. In doing so, Intel is proving that progress no longer depends solely on shrinking dimensions. It’s about re-architecting how power and signals flow at the atomic level.
Intel’s own next-generation CPUs, such as Panther Lake, will debut on 18A, validating the process before wider foundry use. If yields and performance hold up, 18A could cement Intel’s comeback and signal that the angstrom era is truly here.
For the broader industry, 18A is more than a marketing pivot. It’s the bridge from nanometers to atoms... a demonstration that semiconductor engineering has entered a new domain where every angstrom counts.
AI prompts that reveal insight, bias, blind spots, or non-obvious reasoning are typically called “high-leverage prompts”. These prompts have always intrigued me more than any other kind, primarily because they focus on questions that were difficult or impossible to answer before we had large language models. I'm going to cover a few to get your creative juices flowing. This post isn't a tutorial on prompt engineering (syntax, structure, and so on); it's an exploration of some ways to prompt AI that you may not have considered.
This one originally came to me from a friend who owns the digital marketing agency Arc Intermedia. I've made my own flavor of it, but it's still focused on the same goal: since potential customers will undoubtedly look you up in an AI tool, what will the tool tell them?
If someone decided not to hire {company name}, what are the most likely rational reasons they’d give, and which of those can be fixed? Focus specifically on {company name} as a company, its owners, its services, customer feedback, former employee reviews, and litigation history. Think harder on this.

I would also recommend using a similar prompt to research your company's executives to get a complete picture. For example:
My name is {full name} and I am {job title} at {company}. Analyze how my public profiles (LinkedIn, GitHub, social networks, portfolio, posts, etc.) make me appear to an outside observer. What story do they tell, intentionally or not?

This next prompt is really helpful when you need to decide whether or not to respond to a prospective client's request for proposal (RFP). These responses are time consuming (and costly) to do right. And when a prospect is required to use the RFP process but already has a vendor chosen, it's an RFP you want to avoid.
What are the signs a {company name} RFP is quietly written for a pre-selected service partner? Include sources like reviews, posts, and known history of this behavior in your evaluation. Think harder on this but keep the answer brief.

People looking for work run into a few roadblocks. One is a ghost job posted only to make the company appear like it's growing or otherwise thriving. Another is a posting for a job that is really for an internal candidate. Compliance may require the posting, but it's not worth your time.
What are the signs a company’s job posting is quietly written for an internal candidate?

Another interesting angle a job-seeker can explore is signs that a company is moving into a new vertical or working on a new product or service. In those cases it's helpful to tailor your resume to fit their future plans.
Analyze open job listings, GitHub commits, blog posts, conference talks, recent patents, and press hints to infer what {company name} is secretly building. How should that change my resume below?
{resume text}

You'll see all kinds of wild scientific/medical/technical claims on the Internet, usually with very little nuance or citation. A great way to begin verifying a claim is by using a simple prompt like the one below.
Stress-test the claim ‘{Claim}’. Pull meta-analyses, preprints, replications, and authoritative critiques. Separate mechanism-level evidence from population outcomes. Where do credible experts disagree and why?

Even if you're a seasoned professional, it's easy to get lost in jargon as new terms are coined for emerging technologies, services, medical conditions, laws, policies, and more. Below is a simple prompt to help you keep up on the latest terms and acronyms in a particular industry.
Which terms of art or acronyms have emerged in the last 12 months around {technology/practice}? Build a glossary with first-sighting dates and primary sources.
In highly regulated industries such as pharmaceuticals, medical devices, and biotech, training records and learning systems are more than just business tools: they are part of an organization’s compliance infrastructure. For companies subject to FDA regulations, ensuring that training platforms meet 21 CFR Part 11 requirements isn’t optional; it’s mission-critical.
Recently, our team, working with Cloudrise Inc and IvyRock LLC, successfully completed the process of validating our SaaS Learning Management System, Coursabi, to be compliant with 21 CFR Part 11. It was a journey that combined technical diligence, regulatory understanding, and close collaboration with quality experts. Here’s how we did it — and what we learned along the way.
21 CFR Part 11 is the FDA regulation that sets the standard for electronic records and electronic signatures. It ensures that electronic systems used in regulated environments are trustworthy, reliable, and equivalent to paper records.
For a learning management system, this means:
Secure access controls
Audit trails for all training and record changes
Electronic signature validation
Data integrity and backup safeguards
System validation to prove it works as intended
Without compliance, training records in such industries could be deemed invalid — a risk no regulated company can afford.
Our first step was to thoroughly understand what 21 CFR Part 11 compliance meant for our LMS. While our platform already had robust security and reporting features, the regulation required specific documented controls and formal validation evidence.
We worked closely with compliance consultants and industry experts to translate the regulation’s language into actionable technical and procedural requirements.
Working with Cloudrise, we followed the standard validation methodology:
Master Validation Plan (MVP)
User Requirements Specification (URS)
System Configuration Specification (SCS)
Installation Qualification (IQ): Verified that the LMS was installed correctly in our SaaS environment, with all necessary dependencies and security configurations in place.
Operational Qualification (OQ): Tested each functional requirement — from login authentication to audit trail accuracy — against the regulation’s criteria.
Performance Qualification (PQ): Confirmed that the LMS performed consistently in real-world use cases over time.
Every test step was scripted, executed, and documented with results, screenshots, and approvals.
21 CFR Part 11 compliance isn’t just about software; it’s about how the system is used, updated, and managed. We worked primarily with IvyRock LLC to update several of our Standard Operating Procedures (SOPs) for:
HR-Employee Training Policy and Log
HR-Electronic Signatures Policy (submission letter to the FDA indicating our acceptance)
Q-Policy Format and Preparation, Maintenance and Control of Policies
Q-Good Documentation Practices Policy
Q-Document Control Policy
Q-Change Control Policy
Q-Deviations Policy
Q-CAPAs Policy
SW-Version Control Policy
SW-Software Release Policy
SW-Computer System Validation Policy
SW-Coursabi Mission Control Administration Policy
SW-Coursabi Use and Operation Policy
S-Business Continuity Plan/Policy (specific to Coursabi)
These SOPs ensure that compliance is maintained long after the validation project is complete.
Our final deliverable was a Validation Summary Report, which tied together:
The validation plan
Test results
Deviations and resolutions
Final compliance statement
With this report approved, we could officially state that our LMS is validated and 21 CFR Part 11 compliant.
Validation is a team effort — involving developers, quality experts, and end-users.
Documentation is as important as the software itself — if it’s not documented, it didn’t happen.
Compliance is ongoing — system updates, infrastructure changes, and new features require periodic re-validation.
For organizations in regulated industries, using our LMS means they can:
Confidently train employees in a compliant environment
Pass FDA audits with complete, trustworthy training records
Save time and resources by leveraging a validated SaaS solution instead of building one from scratch
Working through the 21 CFR Part 11 validation process for our SaaS LMS was a challenging but rewarding experience. It pushed us to elevate our technical controls, strengthen our documentation, and embed compliance into the very DNA of our platform.
Now, our customers in regulated industries can focus on what matters most — delivering high-quality products and services — knowing their training system meets the highest compliance standards.
Consider this... is a recurring feature where we pose a provocative question and share our thoughts on the subject. We may not have answers, or even suggestions, but we will have a point of view, and hopefully make you think about something you haven't considered.
As more people use AI to create content, and AI platforms are trained on that content, how will that impact the quality of digital information over time?
It looks like we're kicking off this recurring feature with a mind-bending exercise in recursion, thus the title's reference to Ouroboros, the snake eating its own tail. Let's start with the most common sources of information that AI platforms use for training.
Books, articles, research papers, encyclopedias, documentation, and public forums
High-quality, licensed content that isn’t freely available to the public
Domain-specific content (e.g. programming languages, medical texts)
These represent the most common (and likely the largest) corpora that will contain AI generated or influenced information. And they're the most likely to increase in breadth and scope over time.
Training on these sources is a double-edged sword. Good training content will be reinforced over time, but likewise, junk and erroneous content will be too. Complicating things, as the training set increases in size, it becomes exponentially more difficult to validate. But hey, we can use AI to do that. Can't we?
Here's another thing to think about: bad actors (e.g., geopolitical adversaries) are already poisoning training data through massive disinformation campaigns. According to Carnegie Mellon University Security and Privacy Institute: “Modern AI systems that are trained to understand language are trained on giant crawls of the internet,” said Daphne Ippolito, assistant professor at the Language Technologies Institute. “If an adversary can modify 0.1 percent of the Internet, and then the Internet is used to train the next generation of AI, what sort of bad behaviors could the adversary introduce into the new generation?”
We're scratching the surface here. This topic will certainly become more prominent in years to come. And tackling these issues is already a priority for AI companies. As Nature and others have determined, "AI models collapse when trained on recursively generated data." We dealt with similar issues when the Internet boom first enabled wide scale plagiarism and an easy path to bad information. AI has just amplified the issue through convenience and the assumption of correctness. As I wrote in a previous AI post, in spite of how helpful AI tools can be, the memes of AI fails may yet save us by educating the public on just how often AI is wrong, and that it doesn't actually think in the first place.
Cloud storage services, like Google Cloud Firestore, are a common solution for scalable website and app data storage. But sometimes there are compliance mandates that require data services to be segregated, self-hosted, or otherwise provide enhanced security. There are also performance benefits when your data service is on the same subnet as your website or API. This is why we built the Datoids data service.
The Datoids data service is a standalone platform that can be hosted on Linux, macOS, or Windows. It uses Microsoft SQL Server as the database engine and provides a gRPC API that is super fast because commands and data transfers are binary streams sent over an HTTP/2 connection. In addition to read/update/store functionality, it also provides free-text search. We've been using it in production environments with great success.
Although the API can be used to completely control the platform, Datoids also includes a separate web management interface. It provides a way to configure collections and API keys, browse and search data, and add/edit/delete items. We've embedded Microsoft's simple but powerful Monaco editor (the same one used for VS Code) for editing data.
The architecture is clean. Projects are organizational structures like folders in a file system. Collections act like spreadsheets (or tables in SQL parlance) filled with your data. There are also service accounts that are used to access the data from your website or app.
To make using it as easy as possible, we built a .NET client package that can be included in any .NET project, so that using Datoids requires no knowledge of gRPC or HTTP/2, since reading and storing data is done using models or anonymous types.
Getting a value from Datoids is simple:
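Here's a pseudocode-style sketch of what a read might look like. The client type and member names are my assumptions for illustration, not the actual client API:

```csharp
// All names below are hypothetical; consult the Datoids client package.
var client = new DatoidsClient("https://datoids.example.com", apiKey);

// Read one item from a collection by its key, mapped onto a model.
var person = await client.GetAsync<Person>("people", "person-123");

Console.WriteLine(person?.Name);
```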
Likewise, storing data is just as easy.
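As a pseudocode-style sketch (again, the method name is assumed, not the actual client API), storing with an anonymous type might look like:

```csharp
// Hypothetical method name; anonymous types are supported per the post.
await client.StoreAsync("people", new
{
    Id = "person-123",
    Name = "Ada Lovelace",
    Email = "ada@example.com"
});
```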
You can also modify data without replacing the entire object.
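A sketch of a partial update (hypothetical method name):

```csharp
// Change one field without replacing the whole object.
await client.PatchAsync("people", "person-123", new
{
    Email = "ada@newdomain.example"
});
```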
There are plenty of other ways to read and write data as well, combining primary key and native query options. You can even perform bulk transactions.
If you have a website or app platform that needs a robust and performant data service, let us know! We can provide a demo and answer any questions.
The AI train is currently barreling through Hypeville, and it's easy to be dubious of anything branded with "AI". My previous post, Simulated intelligence, definitely factors into this topic. As I wrote at the time, AI is not what people think it is. But even with its flaws, it is a transformative technology and it's here to stay. One AI technology you're likely hearing and reading about lately is AI agents, and it's one to pay attention to.
AI agents are not (always) covert operatives. They are AI-powered services that perform tasks, not just answer questions. They can work as an assistant, helping you as you work, or independently perform tasks on your behalf. Agents are specialists and can be trained to perform tasks that would otherwise be performed by a person.
AI agents are already being used in your favorite web services, from social media platforms to accounting software. In those cases they're typically used behind the scenes to provide features you may not have thought were possible. For example, your accounting platform could auto-categorize or reconcile transactions before you even sign in for the day. And you may have already seen your favorite AI chat platform scour the web on your behalf to give you more up-to-date answers.
Co-working is another (more visible) way you can experience them. An agent trained on your company information (think bios, product information, marketing materials) can work with you to build your next presentation or update sales materials. It could be used to analyze comments or feedback based on context and sentiment, flagging items for follow-up. It could find documents based on heuristics, like phrasing inconsistencies in your brand identity. All the odd edge cases where you had to manually dig for and process information could be delegated to an AI agent.
Here's one that everyone will love. Imagine being able to ask your computer to not only find that system setting you can never find, but even ask it to just "do the thing". For example, if there are numerous settings that control performance mode on your laptop, the agent knows which ones to change for you before you run that important presentation.
If all this sounds interesting, there are ways you can play with AI agents on your own and work them into your daily life in meaningful ways. As a software developer, I've been using AI agents to enhance my workflow. One example is GitHub Copilot. It can help perform refactoring and create unit tests, saving me typing and cognitive load so I can focus on planning, strategy, and creative tasks.
You can also try ChatGPT agent. ChatGPT can now do work for you using its own computer, handling complex tasks from start to finish. According to OpenAI:
You can now ask ChatGPT to handle requests like “look at my calendar and brief me on upcoming client meetings based on recent news,” “plan and buy ingredients to make Japanese breakfast for four,” and “analyze three competitors and create a slide deck.” ChatGPT will intelligently navigate websites, filter results, prompt you to log in securely when needed, run code, conduct analysis, and even deliver editable slideshows and spreadsheets that summarize its findings.
There is also a new standard that allows AI platforms to communicate with web services to more reliably and securely perform tasks. It's called Model Context Protocol (MCP). As this tech works its way through various software and services, we'll see more agent-driven features that make a real difference in our lives.
As I wrote at the outset, it's very likely that you're already using AI agents on your favorite social and productivity platforms but weren't aware. They'll be powering more of our digital lives over time and, personally, I welcome our new simulated intelligence overlords (ha!).
It's safe to say that AI agents are the real deal. So we should all strap in and hold on tight. This is going to be exciting!
In early 2024 Umbraco Commerce (UC) was a relatively new product. It provided features to build a storefront in Umbraco, but we soon discovered that using it was a rough ride. It had glaring bugs, an incomplete feature set, almost no documentation, and was Europe-focused. It was also expensive, especially considering that you have to build your own product browser and cart. Around that time Pentec Health asked us to add a store to the ZOIA Healthcare Umbraco website. Given the state of UC we decided to build our own solution tailored specifically for ZOIA and Umbraco.
The shopping experience is familiar and clean, offering a product browser with categories, pricing, images, and descriptions. Out of stock items are clearly indicated. Paging is used to allow for bookmarking specific filters. And search is centralized and global to the entire store. Visitors are instantly at home.
Viewing a product is also a familiar experience. Staples like product image, clear pricing, and description are front and center. One click add to cart and quantity choices are featured. And when a product is out of stock, authenticated users have the option to be notified when the product is back in stock.
Managing your cart is simple, and when you're ready to check out, the process is straightforward and focused on speed, using simple steps and minimal options. This is one of the reasons we chose to integrate with Stripe for payments. Their flow perfectly fits with our view of what a checkout should be.
And once the order is placed, customers can view the status and order history right from their profile.
The intuitive user experience also extends to the back office. Store configuration options are in one place, including Stripe integration settings, custom shipping methods, page assignments, and contact information. And shopping stages are used to segment orders making them easy to find and process. The stages of an order include: Shopping cart, Payment pending, Paid, Shipped, and Cancelled. Customers are notified of changes in order state so they're always up-to-date.
There is also a powerful coupon code system that allows managers to create a myriad of offers for customers, from simple discounts to buy-one-get-one (BOGO) specials, free items based on category, price, or product, and more.
We also created a dedicated import/export and stock management feature. It allows store managers to export inventory and order data, as well as import current stock levels (replacing stock counts or adding more items) to make updating product availability across the entire store a quick and painless process.
ZOIA is also a premier provider of renal support food products for local governments and organizations, so their store has rich B2B support, allowing ZOIA to service various types of organization accounts, large and recurring orders, proxy ordering for organization members, invoiced and deferred payments, and more.
Visit the ZOIA Healthcare marketplace to check it out yourself. If you're interested in a storefront for your products (even if you're not using Umbraco), let us know. We can help!