<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Maurizio Morri Substack]]></title><description><![CDATA[AI, Biology, Nature]]></description><link>https://www.mauriziomorri.tech</link><image><url>https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png</url><title>Maurizio Morri Substack</title><link>https://www.mauriziomorri.tech</link></image><generator>Substack</generator><lastBuildDate>Wed, 08 Apr 2026 15:29:16 GMT</lastBuildDate><atom:link href="https://www.mauriziomorri.tech/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Maurizio Morri]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[mauriziomorri@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[mauriziomorri@substack.com]]></itunes:email><itunes:name><![CDATA[Maurizio Morri]]></itunes:name></itunes:owner><itunes:author><![CDATA[Maurizio Morri]]></itunes:author><googleplay:owner><![CDATA[mauriziomorri@substack.com]]></googleplay:owner><googleplay:email><![CDATA[mauriziomorri@substack.com]]></googleplay:email><googleplay:author><![CDATA[Maurizio Morri]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The birth of the integrated circuit]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/the-birth-of-the-integrated-circuit</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/the-birth-of-the-integrated-circuit</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Wed, 25 Mar 
2026 00:02:08 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Programming history is usually told through languages, operating systems, and famous pieces of software. But some of the most important days in the history of programming are really hardware days, because they changed the scale at which software could exist. March 24, 1959 is one of those dates. It was the day Texas Instruments publicly demonstrated the first integrated circuit, based on Jack Kilby&#8217;s work. The Computer History Museum notes that this demonstration showed multiple electronic components on the same piece of semiconductor material, a step that would become foundational for modern computing.</p><p>At first glance, that may sound too far from programming to belong in a programming series. But the connection is actually direct. Before integrated circuits, computers depended on larger, more fragile, more power hungry assemblies of discrete components. That limited reliability, density, and cost. Once integration became possible, the trajectory of computing changed. Machines could become smaller, cheaper, and more powerful over time, which meant programmers were no longer writing only for rare institutional systems. They were gradually writing for platforms that could spread everywhere. The Computer History Museum also notes that while Kilby demonstrated the first integrated circuit, Robert Noyce helped make the idea commercially viable soon after, which is what turned a laboratory achievement into the basis of an industry.</p><p>This matters for programming history because software always inherits the shape of the machine underneath it. 
You cannot have mass market operating systems, personal computing, embedded software, mobile apps, cloud infrastructure, or modern AI accelerators without the density and scalability that integrated circuits made possible. The history of programming is not only the history of abstractions. It is also the history of what abstractions the hardware can afford to support.</p><p>There is a useful lesson here about how technical revolutions actually happen. The integrated circuit did not instantly create modern software culture in 1959. But it created the physical precondition for the long chain that followed. More transistors on smaller chips led to more capable computers. More capable computers led to richer operating systems, higher level languages, broader software markets, and eventually to the expectation that computation could be embedded in almost every part of life. Programming did not become central to society because syntax got better alone. It became central because the substrate underneath it kept shrinking, accelerating, and spreading.</p><p>That is why March 24 deserves a place in the history of programming. It marks one of the earliest visible points where computing stopped being only a large machine problem and started becoming a scalability problem. Once electronics could be integrated, software gained room to grow. Every later wave of programming, from Unix to C, from the personal computer to the web, from smartphones to machine learning, sits somewhere downstream of that shift.</p><p>So this entry is not about a language release or a famous algorithm. It is about the day the physical future of programming became more plausible. 
On March 24, 1959, the integrated circuit was demonstrated to the world, and from that point on, programming had a radically larger future to inhabit.</p><p>Sources</p><p>https://www.computerhistory.org/tdih/</p><p>https://www.computerhistory.org/timeline/software-languages/</p>]]></content:encoded></item><item><title><![CDATA[Small releases are still important releases]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/small-releases-are-still-important</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/small-releases-are-still-important</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Thu, 19 Mar 2026 22:08:22 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Programming history is often told through giant milestones. A language is invented. A paradigm appears. A famous version changes everything. But sometimes the more important shift is procedural. It changes how a language evolves, how developers adopt it, and how the ecosystem learns to live with change. March 19, 2019 is one of those dates in programming history because it was the general availability date of JDK 12.</p><p>On paper, Java 12 was not the kind of release that usually gets romantic treatment. It was not the birth of Java, and it was not an LTS release. But that is exactly why it matters. JDK 12 showed that Java had fully entered its new six month cadence, where the platform no longer had to wait for huge theatrical upgrades to move forward. 
OpenJDK lists March 19, 2019 as the general availability date for JDK 12, and Oracle described the release as part of the faster cadence designed to give developers quicker access to completed enhancements.</p><p>That procedural change had real technical consequences. For years, major language ecosystems often trained developers to think in long pauses and giant leaps. You waited. Features piled up. Migration became emotionally expensive. Then a major release arrived carrying too many changes at once. The six month Java model pushed in the opposite direction. Smaller releases meant less drama, more iteration, and a healthier rhythm between language design and production reality.</p><p>Java 12 itself also captured something important about how modern languages evolve. One of its headline items was the preview of switch expressions. That may sound modest, but it reflected a deeper trend in programming language design. Even a mature enterprise language like Java was still learning from decades of experience about expressiveness, clarity, and reducing accidental complexity. Turning switch from a statement into something that could behave more like an expression was not just syntax polish. It was part of the long movement toward safer, more readable code and fewer awkward control flow patterns.</p><p>This is why March 19, 2019 deserves a place in the history of programming. It marks a moment when Java stopped treating evolution as a rare event and started treating it as infrastructure. That is a bigger cultural change than it first appears. It means the language was no longer defined only by the giant headline release. It was defined by a disciplined process of continuous refinement.</p><p>That idea has spread far beyond Java. Today, many programming ecosystems live on predictable release trains, feature previews, staged adoption, and incremental modernization. In hindsight, that feels normal. But it represents a philosophical shift in software engineering. 
Languages are no longer monuments that are occasionally renovated. They are systems under active maintenance, always moving, always negotiating between stability and progress.</p><p>So this entry in the history of programming is not about a revolutionary new language. It is about a mature language proving that stability and change do not have to be enemies. On March 19, 2019, Java 12 made that visible. And for millions of developers working in long lived systems, that may have been one of the most important programming lessons of the decade.</p><p>Sources</p><p>https://openjdk.org/projects/jdk/12/</p><p>https://www.oracle.com/africa/corporate/pressrelease/latest-java-release-2019-03-19.html</p><p>https://mail.openjdk.org/pipermail/announce/2019-March/000265.html</p>]]></content:encoded></item><item><title><![CDATA[The Day Programming Became a Public Market Story]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/the-day-programming-became-a-public</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/the-day-programming-became-a-public</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Fri, 13 Mar 2026 21:30:26 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Programming history is often told through languages, operating systems, or famous machines. But some dates matter because they changed who got to participate in the future of software. March 13, 1986 is one of those dates. It was the day Microsoft went public.</p><p>At first glance, an IPO might seem more like financial history than programming history. But in practice, Microsoft&#8217;s public debut marked a turning point in the economics of software. 
It helped validate the idea that code itself could be the foundation of a massive independent business, not just an accessory sold with hardware. The Computer History Museum notes that Microsoft began trading on the NASDAQ on March 13, 1986, nearly eleven years after its founding, and raised almost $61 million in a single day. That was a powerful signal to the industry that software had become central, not secondary. Source: <a href="https://www.computerhistory.org/tdih/march/13/">https://www.computerhistory.org/tdih/march/13/</a></p><p>That shift mattered deeply for programmers. In the earlier decades of computing, hardware companies often dominated the conversation. Software was important, but it was frequently bundled, constrained, or treated as part of a larger machine business. Microsoft helped push a different model into the mainstream. The value was increasingly in the operating system, the development tools, the office software, the platform, and the ecosystem around them. In other words, the real strategic asset was the layer programmers built.</p><p>This public market moment also changed the cultural image of programming. After Microsoft&#8217;s rise, it became much easier to imagine that writing software was not just technical work but industrial power. The programmer was no longer only a specialist serving institutions or hardware vendors. The programmer became a builder of products that could scale globally. That idea now feels obvious, but in 1986 it was still becoming real.</p><p>It is also worth remembering what Microsoft represented at that time. This was not yet the giant that would later define desktop computing for an entire generation. It was a company that had grown by betting that software platforms would matter enormously. That bet turned out to be correct. 
The IPO did not create Microsoft&#8217;s influence on its own, but it amplified it, gave it capital, legitimacy, and visibility, and helped accelerate the broader software economy that followed.</p><p>From the perspective of programming history, March 13 stands as a reminder that code does not shape the world only through technical elegance. It also shapes the world through institutions, markets, and distribution. A language can be brilliant, an operating system can be well designed, and a development tool can save millions of hours, but scale often arrives when the surrounding business structure allows software to spread everywhere. Microsoft&#8217;s IPO was one of the clearest moments when that structure locked into place.</p><p>There is another reason this date still matters. Today, when we talk about AI platforms, developer ecosystems, open source monetization, cloud infrastructure, and software companies becoming geopolitical actors, we are living inside the long aftermath of that transition. The software company as a dominant force in the economy did not begin on a single day, but March 13, 1986 is one of the cleanest symbols of its arrival.</p><p>So this entry in the history of programming is not about a new syntax, a compiler breakthrough, or a famous algorithm. It is about the moment the market publicly recognized that programming had moved to the center of modern industry. 
That recognition changed the fate of companies, developers, and the entire software landscape.</p><p><strong>Sources</strong></p><p>Computer History Museum, &#8220;What Happened on March 13th&#8221;<br><a href="https://www.computerhistory.org/tdih/march/13/">https://www.computerhistory.org/tdih/march/13/</a></p><p>Computer History Museum, &#8220;What Happened Today, March 13th&#8221;<br><a href="https://www.computerhistory.org/tdih/">https://www.computerhistory.org/tdih/</a></p>]]></content:encoded></item><item><title><![CDATA[The Day the Web Became a Programming Platform]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/the-day-the-web-became-a-programming</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/the-day-the-web-became-a-programming</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Tue, 10 Mar 2026 20:40:11 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>On March 10, 1997, Netscape announced what it called a third generation of its web browser software, aimed directly at Microsoft and with an explicit focus on extranets, meaning private internet style networks connecting companies. That single line sounds quaint today, but it captures a real inflection point. The browser was no longer just a document viewer. It was being positioned as enterprise infrastructure, and that reframed the browser as a programmable surface where scripting, layout, and security decisions would decide winners. 
<a href="https://www.computerhistory.org/tdih/march/10/">https://www.computerhistory.org/tdih/march/10/</a></p><p>This announcement sits right at the peak tension of the first browser war. Netscape had dominated earlier, then Microsoft began using Windows distribution to push Internet Explorer. The competition forced both sides to ship features fast, and developers lived inside the fallout: incompatible behavior, rapidly changing APIs, and a new expectation that &#8220;web development&#8221; meant real programming, not just writing markup. The later US antitrust case and its findings make clear how central browser bundling and browser share had become to platform control. <a href="https://www.justice.gov/atr/us-v-microsoft-courts-findings-fact">https://www.justice.gov/atr/us-v-microsoft-courts-findings-fact</a></p><p>What Netscape was pointing toward on March 10 became more concrete a few months later with Netscape Communicator 4, released in June 1997 as an internet suite that bundled the browser with email, an HTML editor, and collaboration tooling that clearly targeted business use. That packaging matters for programming history because it reflects a shift in mental model. The browser stopped being a single app and started being an application runtime plus the default user interface for networked work. <a href="https://www.webdesignmuseum.org/software/netscape-communicator-4-01-in-1997">https://www.webdesignmuseum.org/software/netscape-communicator-4-01-in-1997</a></p><p>At the standards layer, 1997 is also when the industry was trying to stabilize the web as a platform, not a moving target defined by two vendors. W3C&#8217;s work on HTML 4.0, including public drafts in mid 1997, is part of the same story: browsers were innovating aggressively, but the long term value came from making the platform predictable enough that programmers could build on it without rewriting everything every release. 
<a href="https://www.w3.org/press-releases/1997/html40-draft/">https://www.w3.org/press-releases/1997/html40-draft/</a></p><p>So March 10, 1997 is worth remembering not because a specific version shipped that day, but because it captures the browser&#8217;s transition into a strategic programming platform. Extranets were the enterprise justification, but the deeper outcome was that the browser became where software got delivered, where languages like JavaScript earned their place, and where the modern idea of shipping applications over a network became normal.</p><p>Sources: <a href="https://www.computerhistory.org/tdih/march/10/">https://www.computerhistory.org/tdih/march/10/</a> <a href="https://www.webdesignmuseum.org/software/netscape-communicator-4-01-in-1997">https://www.webdesignmuseum.org/software/netscape-communicator-4-01-in-1997</a> <a href="https://www.justice.gov/atr/us-v-microsoft-courts-findings-fact">https://www.justice.gov/atr/us-v-microsoft-courts-findings-fact</a> <a href="https://www.w3.org/press-releases/1997/html40-draft/">https://www.w3.org/press-releases/1997/html40-draft/</a></p>]]></content:encoded></item><item><title><![CDATA[The Homebrew Computer Club]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/the-homebrew-computer-club</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/the-homebrew-computer-club</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Wed, 04 Mar 2026 00:14:52 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>March 3, 1975: The Homebrew Computer Club 
and the Day Programming Went Personal</p><p>On March 3, 1975, the Homebrew Computer Club held its first meeting in a garage in Menlo Park, California. Founders Fred Moore and Gordon French hosted about 30 microcomputer hobbyists, and the Altair was a central topic because it was a computer you could build at home from a kit. (<a href="https://www.computerhistory.org/tdih/march/3/">CHM</a>)</p><p>This date matters for programming because it marks a shift in who got to be a programmer. Before the mid 1970s, programming was strongly associated with institutions, mainframes, and controlled access. The Homebrew scene helped flip that model. When a computer becomes something you can assemble and experiment with at home, programming stops being an internal skill of organizations and becomes a personal craft, learned socially, shared in person, and improved through iteration.</p><p>The Altair focus is also important because it is basically the prototype of the modern developer platform story. A constrained machine appears, a community forms around it, and the community builds tools, languages, and workflows that make the machine useful. That pattern keeps repeating, from early microcomputers to personal computing, to smartphones, to today&#8217;s GPU and AI developer stacks. The hardware changes, but the social mechanism is the same: a small group of obsessed builders turns a device into an ecosystem.</p><p>The Computer History Museum&#8217;s summary makes a simple claim that holds up historically: the club and similar groups contributed to the growing popularity of the personal computer. (<a href="https://www.computerhistory.org/tdih/march/3/">CHM</a>) In practice, that popularity was powered by code. People did not fall in love with kits. They fell in love with what they could make them do, and that is why this garage meeting belongs in programming history. 
It is one of the moments where programming moved closer to everyday life and stayed there.</p><p><a href="https://www.computerhistory.org/tdih/march/3/">https://www.computerhistory.org/tdih/march/3/</a></p>]]></content:encoded></item><item><title><![CDATA[When Supercomputing Became a Benchmark Dispute]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/when-supercomputing-became-a-benchmark</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/when-supercomputing-became-a-benchmark</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Tue, 03 Mar 2026 00:10:57 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!T_xv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>On March 2, 1993, the Computer History Museum records a moment that reads like a template for today&#8217;s AI hardware race. The New York Times reported that Japan&#8217;s National Institute for Fusion Science announced that a Japanese supercomputer designed by NEC could perform the tasks the institute required, and Cray Research insisted on testing the machine itself before accepting the claim. https://www.computerhistory.org/tdih/march/2/</p><p>That small detail, &#8220;insisted on testing,&#8221; is the whole story. In high performance computing, performance is never just a number, because the number depends on what you run, how you run it, what you count, and what you quietly ignore. The moment a procurement decision is on the line, benchmarks stop being technical trivia and become the arena where engineering, credibility, and national ambition collide.</p><p>If that feels familiar, it should. AI infrastructure is repeating the same cycle, only faster and louder. 
Vendors can show dramatic tokens per second or training throughput, but operators care about what happens under real workloads, real latency targets, real memory constraints, and real power limits. That is one reason MLCommons has invested so heavily in MLPerf as a standardized benchmark suite for training and inference, an attempt to make competing systems comparable without letting everyone pick their own favorite test. https://mlcommons.org/benchmarks/</p><p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!T_xv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!T_xv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png 424w, https://substackcdn.com/image/fetch/$s_!T_xv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png 848w, https://substackcdn.com/image/fetch/$s_!T_xv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png 1272w, https://substackcdn.com/image/fetch/$s_!T_xv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!T_xv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png" width="1068" height="796" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:796,&quot;width&quot;:1068,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1202399,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.mauriziomorri.tech/i/189714897?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!T_xv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png 424w, https://substackcdn.com/image/fetch/$s_!T_xv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png 848w, https://substackcdn.com/image/fetch/$s_!T_xv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png 1272w, https://substackcdn.com/image/fetch/$s_!T_xv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd45bb226-b13c-48ca-b89c-ba4eb542a0c5_1068x796.png 1456w" 
sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Even within MLPerf, the story is not just raw speed. MLCommons explicitly frames the space as exploding in variety, with many organizations building inference chips and systems spanning huge ranges of power and performance. https://mlcommons.org/benchmarks/inference-datacenter/ Reuters covered this dynamic too, highlighting benchmark tests aimed at response speed for AI applications and the parallel importance of power efficiency. 
https://www.reuters.com/technology/new-ai-benchmark-tests-speed-responses-user-queries-2024-03-27/</p><p>The deeper programming lesson from March 2 is that performance disputes are rarely about a single machine. They are about definitions. What is the workload. What is the acceptance criterion. What is the allowed tuning. What counts as fair. In 1993 it was supercomputers and national labs. In 2026 it is GPUs, inference clusters, and model serving stacks. The arguments rhyme because the constraint is the same: when compute becomes strategic, measurement becomes political, and programmers end up writing the reality behind the claims.</p>]]></content:encoded></item><item><title><![CDATA[Cray Research's most important acquisition]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/cray-research-most-important-acquisition</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/cray-research-most-important-acquisition</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Thu, 26 Feb 2026 22:53:41 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Kvwh!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!Kvwh!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Kvwh!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Kvwh!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Kvwh!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Kvwh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg" width="600" height="277" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:277,&quot;width&quot;:600,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Cray Research X MP badge&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Cray Research X MP badge" title="Cray Research X MP badge" 
srcset="https://substackcdn.com/image/fetch/$s_!Kvwh!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Kvwh!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Kvwh!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Kvwh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F151256f2-c156-4684-ab6e-64524253113a_600x277.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>On February 26, 1996, Silicon Graphics Inc. bought Cray Research for $767 million, a deal that temporarily made SGI the leading supplier of high speed computing machines in the United States. (<a href="https://www.computerhistory.org/tdih/february/26/">CHM</a>) The headline sounds like business history, but the real story is what it did to programming culture. It brought Hollywood class visualization, MIPS and IRIX engineering, and the Cray tradition of extreme performance under the same corporate roof. (<a href="https://www.computerhistory.org/tdih/february/26/">CHM</a>)</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.mauriziomorri.tech/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Maurizio Morri Substack is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>Cray represented the idea that hardware design could pull science forward by making the impossible computable, from weather modeling to defense simulations. 
(<a href="https://www.computerhistory.org/tdih/february/26/">CHM</a>) SGI represented the idea that interactive computing and graphics pipelines could change how humans work with complex systems. The combined signal to programmers was blunt: performance would no longer be an exotic concern reserved for national labs. It would become a product feature.</p><p>That forced a shift in what &#8220;good code&#8221; meant. Correctness was no longer enough. You needed a mental model of memory hierarchy, cache behavior, vectorization, parallel decomposition, and the cost of communication. The era that followed popularized programming techniques that are still the backbone of modern AI and scientific computing: you structure data so it streams, you batch work so overhead amortizes, you minimize synchronization, and you design algorithms around bandwidth.</p><p>This date also sits near a deeper historical arc that is easy to miss. Supercomputing has always been a negotiation between what scientists want to ask and what machines can answer. When SGI and Cray collided, that negotiation moved closer to mainstream software engineering. Over time, the lesson generalized: programming languages and compilers matter, but performance lives in systems. Runtimes, schedulers, libraries, interconnects, and profiling tools become part of the programming model whether you like it or not.</p><p>If you write high performance code today, especially for GPUs, distributed training, large scale simulation, or long context inference, you are still living in the world that deals like this helped normalize. The machines changed. 
The constraints stayed.</p><p><a href="https://www.computerhistory.org/tdih/february/26/">https://www.computerhistory.org/tdih/february/26/</a></p>]]></content:encoded></item><item><title><![CDATA[The birth of Automatically Programmed Tools]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/the-birth-of-automatically-programmed</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/the-birth-of-automatically-programmed</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Thu, 26 Feb 2026 00:41:38 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In the popular history of programming, the story usually jumps from early business languages to operating systems and then straight to the web. 
February 25, 1959 is a reminder that one of the most important programming revolutions happened somewhere else: on the factory floor. On this day, MIT demonstrated APT, Automatically Programmed Tools, an English like programming language designed to generate instructions for numerically controlled machine tools.</p><p>That sounds niche until you notice what it really represents. APT turned geometry into code and code into motion. It was a language whose output was not a printed report or a number in memory, but a cutting path through metal. The famous anecdote from the time is almost too perfect: the Computer History Museum notes that a New Yorker article described the Air Force announcing a machine that could receive instructions in English, figure out how to make what was wanted, teach other machines how to make it, and that day it made an ashtray.</p><p>APT mattered because it created a new pattern that is now everywhere: a domain specific language that compiles high level intent into low level control. Today we take this pattern for granted in compilers, shader languages, SQL, and modern ML graph compilers. 
In 1959, APT was doing it with parts, tools, and tolerances. It also helped cement the idea that programming is not only about computing results, it is also about controlling physical processes.</p><p>You can draw a straight line from APT to modern CAD CAM workflows. The human specifies geometry and constraints, the system computes a toolpath, and a postprocessor emits machine specific commands. That layering is basically a compiler toolchain, except the target is a machine controller and the bugs can snap tools, scrap parts, or worse. This is one reason manufacturing software has always been a high stakes branch of programming: the runtime is reality.</p><p>If you want a clean lesson for today, it is that some of the most consequential programming history is not about screens, apps, or networks. It is about translation. APT was an early proof that code could be a bridge between human intent and the physical world, and that bridge would go on to shape automation, robotics, aerospace manufacturing, and the entire idea of software defined production.</p><p><a href="https://www.computerhistory.org/tdih/february/25/?utm_source=chatgpt.com">https://www.computerhistory.org/tdih/february/25/</a> <br><a href="https://www.acm.org/education/otd-in-computing-history?utm_source=chatgpt.com">https://www.acm.org/education/otd-in-computing-history</a></p>]]></content:encoded></item><item><title><![CDATA[The Long Arc From NeXTSTEP to Agentic Coding]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/the-long-arc-from-nextstep-to-agentic</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/the-long-arc-from-nextstep-to-agentic</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Tue, 24 Feb 2026 23:19:25 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>February 24 is a clean date for programming history because it marks the birth of Steve Jobs in 1955, and his influence on programming was never just hardware. It was the idea that developer experience is a product, and that the tools, the frameworks, and the interface can change what kinds of software get built. The Computer History Museum highlights Jobs as instrumental in the Macintosh era and in the NeXT chapter that later fed back into Apple&#8217;s modern platform story.</p><p>The technical pivot that matters for programmers is NeXTSTEP. NeXTSTEP was not only an operating system. It was a programming environment built around Objective C, AppKit, and Interface Builder, designed to compress the distance between an idea and a working application. 
The Computer History Museum has a strong write up on how that object oriented stack led Jobs to claim order of magnitude productivity improvements, a direct challenge to the pessimism popularized by The Mythical Man Month.</p><p>NeXTSTEP also became more than a historical curiosity because Apple bought NeXT in 1996, then merged that technology path into what became Mac OS X and later the platform lineage that underpins Apple&#8217;s modern operating systems. That is why so much Apple development culture, from Cocoa to tooling expectations, traces back to NeXT rather than classic Mac OS.</p><p>Fast forward and you can see the same thesis repeating with new primitives. Apple introduced Xcode as an integrated environment intended to make building on Mac OS X faster and more coherent, bundling editing, build, debug, and performance tooling into one place. The point is not the specific feature list from 2003. The point is that Apple kept treating tooling as leverage, not as an afterthought.</p><p>This week&#8217;s twist is that the tooling itself is starting to act like a collaborator. 
Recent reporting says Apple is integrating coding agents into Xcode, including support for systems that can do more than autocomplete, reaching into actions like project configuration and navigation inside the IDE. If that direction holds, it is a continuation of the same historical line: NeXTSTEP tried to collapse app creation time with frameworks and visual tooling, and modern Xcode is trying to collapse it again with agents that can execute multi step changes.</p><p>So today&#8217;s programming history lesson is not nostalgia. It is a reminder that the biggest platform shifts are often shifts in how programmers work. When frameworks become more expressive and tools become more powerful, entirely new categories of software become economical to build. February 24 is a good day to notice that the arc from NeXTSTEP to today&#8217;s agentic IDEs is not accidental. It is one long bet that programming productivity is a design problem.</p><p><a href="https://www.computerhistory.org/tdih/february/24/?utm_source=chatgpt.com">https://www.computerhistory.org/tdih/february/24/</a> <br><a href="https://computerhistory.org/blog/the-deep-history-of-your-apps-steve-jobs-nextstep-and-early-object-oriented-programming/?utm_source=chatgpt.com">https://computerhistory.org/blog/the-deep-history-of-your-apps-steve-jobs-nextstep-and-early-object-oriented-programming/</a> <br><a href="https://en.wikipedia.org/wiki/NeXTSTEP?utm_source=chatgpt.com">https://en.wikipedia.org/wiki/NeXTSTEP</a> <br><a href="https://www.apple.com/newsroom/2003/06/23Apple-Introduces-Xcode-the-Fastest-Way-to-Create-Mac-OS-X-Applications/?utm_source=chatgpt.com">https://www.apple.com/newsroom/2003/06/23Apple-Introduces-Xcode-the-Fastest-Way-to-Create-Mac-OS-X-Applications/</a> <br><a href="https://www.theverge.com/news/873300/apple-xcode-openai-anthropic-ai-agentic-coding?utm_source=chatgpt.com">https://www.theverge.com/news/873300/apple-xcode-openai-anthropic-ai-agentic-coding</a> <br><a 
href="https://www.techradar.com/pro/apple-launches-xcode-26-3-brings-even-more-ai-power-to-coding-on-mac?utm_source=chatgpt.com">https://www.techradar.com/pro/apple-launches-xcode-26-3-brings-even-more-ai-power-to-coding-on-mac</a></p>]]></content:encoded></item><item><title><![CDATA[The birth of computational prime hunting]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/the-birth-of-computational-prime</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/the-birth-of-computational-prime</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Tue, 24 Feb 2026 02:45:15 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>February 23 matters in the history of programming because it marks the birth of Derrick Henry Lehmer in Berkeley, California in 1905, a figure who helped turn number theory into something you 
could run like an experiment. Before modern high level languages, Lehmer&#8217;s work pushed computation forward by treating prime search and factoring as engineering problems, building electromechanical and mechanical &#8220;sieves&#8221; that embodied algorithms in hardware.</p><p>The Lehmer sieve is one of the cleanest reminders that programming is older than software. A sieve is an algorithmic idea, and the Lehmers implemented it as a physical machine that could rapidly test congruences and filter candidates, essentially hard wiring a search procedure into moving parts, switches, and sensors. This is not a museum curiosity. It is an early example of what we still do today when we move hot loops into specialized kernels, vector instructions, GPUs, or custom accelerators. The medium changes. The impulse stays the same: take a mathematical structure and make it execute faster by respecting the constraints of the machine.</p><p>Lehmer also sits directly on the line that connects prime hunting to modern computing practice. 
Prime tests and fast modular arithmetic are not just pure math, they are the backbone of widely used cryptographic systems, and the Computer History Museum explicitly frames prime numbers as important to cryptography in its February 23 entry on Lehmer. When you write software today that depends on secure connections, signatures, or authenticated updates, you are relying on descendants of the same computational number theory mindset: you need fast, reliable procedures for working with large integers and for establishing properties like primality under tight performance budgets.</p><p>If you want a technical moral from today&#8217;s date, it is that a surprising amount of programming history is really the history of making math executable. Lehmer&#8217;s era did it with electromechanical ingenuity. Ours does it with libraries, constant time implementations, hardware acceleration, and careful engineering around side channels. The continuity is the important part. Programming has always been about bridging abstract rules to real machines, and February 23 is a good day to remember one of the people who helped make that bridge concrete.</p><p><a href="https://www.computerhistory.org/tdih/february/23/?utm_source=chatgpt.com">https://www.computerhistory.org/tdih/february/23/</a></p><p><a href="https://mathshistory.st-andrews.ac.uk/Biographies/Lehmer_Derrick/?utm_source=chatgpt.com">https://mathshistory.st-andrews.ac.uk/Biographies/Lehmer_Derrick/</a></p><p><a href="https://en.wikipedia.org/wiki/Lehmer_sieve?utm_source=chatgpt.com">https://en.wikipedia.org/wiki/Lehmer_sieve</a></p><p><a href="https://ed-thelen.org/comp-hist/Mike-Williams-Lehmer.html?utm_source=chatgpt.com">https://ed-thelen.org/comp-hist/Mike-Williams-Lehmer.html</a></p>]]></content:encoded></item><item><title><![CDATA[When Code Started Compiling Itself]]></title><description><![CDATA[Today in the history of programming]]></description><link>https://www.mauriziomorri.tech/p/when-code-started-compiling-itself</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/when-code-started-compiling-itself</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Sat, 21 Feb 2026 20:12:02 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>On February 21, 2010, CoffeeScript crossed a line that programmers quietly recognize as a rite of passage. Its compiler stopped being written in Ruby and became self hosted, meaning CoffeeScript was now able to compile itself. That move is not just a flex. It is a technical milestone because it forces a language implementation to live inside the constraints and expressiveness of the language it defines, and it usually triggers a wave of cleanup in semantics, tooling, and bootstrapping discipline.</p><p>Self hosting is also one of those inflection points that changes community behavior. 
Before self hosting, a language can feel like a thin layer that depends on a parent ecosystem for survival. After self hosting, the language starts to feel like an ecosystem of its own, because the compiler and the language evolve together, and contributors can work in the target language instead of the host language. Even if CoffeeScript&#8217;s cultural peak has passed, this moment is still a clean historical example of how language projects mature.</p><p>Nine years later, February 21 shows up again in a different corner of programming, this time in the AI era. On February 21, 2019, the Linux Foundation announced that Pyro, a probabilistic programming language built on PyTorch, became a project in the LF Deep Learning Foundation. That matters because probabilistic programming sits at the intersection of software engineering and statistical reasoning, and institutional support is one of the signals that a tool has moved from an interesting lab artifact to something the broader ecosystem intends to sustain.</p><p>Put those two dates together and you get a neat snapshot of programming&#8217;s evolution. 
In 2010, a key milestone was a language proving it could stand on its own by compiling itself. In 2019, a key milestone was a language proving it could scale in a community and governance sense by joining a foundation. One is about technical self reliance. The other is about social infrastructure for technical work. Both are necessary for software that survives.</p><p>https://en.wikipedia.org/wiki/CoffeeScript</p><p>https://www.linuxfoundation.org/press/press-release/pyro-probabilistic-programming-language-becomes-newest-lf-deep-learning-project</p>]]></content:encoded></item><item><title><![CDATA[Happy Birthday Python]]></title><description><![CDATA[A part of the series today in programming history]]></description><link>https://www.mauriziomorri.tech/p/happy-birthday-python</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/happy-birthday-python</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Fri, 20 Feb 2026 22:05:08 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1>February 20, 1991: A Hobby Project Called Python</h1><p><strong>35 years ago today, Guido van Rossum posted his side project to the internet. It now runs the world.</strong></p><div><hr></div><p>It was a Wednesday in February. At the Centrum Wiskunde &amp; Informatica in Amsterdam, a 35-year-old Dutch programmer named Guido van Rossum uploaded a compressed archive to alt.sources, a Usenet newsgroup where people shared source code the way you might toss a paperback into a communal free library. The file contained version 0.9.0 of a programming language he&#8217;d been tinkering with during the Christmas break of 1989 &#8212; a language he&#8217;d named after Monty Python&#8217;s Flying Circus, not the snake.</p><p>It was not a big deal. There was no press release. No product launch. No venture funding. 
Just a guy who thought programming could be more pleasant.</p><h2>The itch Guido was scratching</h2><p>Van Rossum had spent years working on ABC, a teaching language developed at CWI. ABC was elegant in theory but frustrating in practice &#8212; it was rigid, closed, and difficult to extend. He loved its clean syntax and readability but hated that it couldn&#8217;t talk to the outside world. You couldn&#8217;t write a script to rename files. You couldn&#8217;t glue two programs together. ABC was a beautiful room with no doors.</p><p>Python was Van Rossum&#8217;s answer: keep ABC&#8217;s readability, but make the language practical, extensible, and open. Even that first 0.9.0 release came with features that still define Python today &#8212; classes with inheritance, exception handling, core data types like lists and dictionaries, and a module system borrowed from Modula-3. The bones were there from day one.</p><p>He later described his design philosophy in a way that has become almost a mantra: code is read more often than it is written. Every choice Python made &#8212; significant whitespace, minimal syntax, explicit over implicit &#8212; flowed from that single insight.</p><h2>The long, quiet rise</h2><p>Python did not take off immediately. Through the 1990s, it was a niche tool, beloved by system administrators and scientists but unknown to most programmers. Perl dominated scripting. Java dominated enterprise. C++ dominated everything else.</p><p>But Python kept growing. Version 1.0 arrived in January 1994, adding functional programming tools like <code>lambda</code>, <code>map</code>, and <code>filter</code>. Van Rossum took on the tongue-in-cheek title &#8220;Benevolent Dictator for Life&#8221; &#8212; a joke that stuck for 27 years.</p><p>Python 2.0 shipped in October 2000 with list comprehensions and garbage collection. 
Then the language entered the long, painful Python 2 versus Python 3 split &#8212; a decade-long migration that tested the community&#8217;s patience but ultimately proved that sometimes you have to break things to fix them.</p><p>And then something happened that nobody predicted. Machine learning exploded, and the researchers reaching for a programming language overwhelmingly reached for Python. NumPy, pandas, scikit-learn, TensorFlow, PyTorch &#8212; the entire modern AI stack was built on Python&#8217;s foundations. A language designed to be pleasant for humans turned out to be the perfect interface for teaching machines.</p><h2>The coincidence</h2><p>Here&#8217;s a small, delightful fact. Python 3.2.0 was released on February 20, 2011 &#8212; exactly 20 years to the day after 0.9.0. Guido van Rossum himself noted the coincidence on Twitter: <em>&#8220;The first Python version, 0.9.0, was released on Feb 20, 1991. Python 3.2.0 was released exactly 20 years later, on Feb 20, 2011.&#8221;</em></p><p>Today, 35 years after that quiet Usenet post, Python is the most popular programming language on the planet by most measures. It is used to train AI models, analyze genomes, build web applications, automate infrastructure, teach introductory computer science, and write the scripts that hold together roughly half of the internet&#8217;s plumbing. Stack Overflow&#8217;s annual survey has placed it at or near the top for years running. GitHub shows it as the most-used language on the platform.</p><p>All of this from a Christmas holiday project by a programmer who just thought coding should feel nicer.</p><h2>What we can learn from Python at 35</h2><p>The history of technology is full of ambitious projects backed by massive teams and enormous budgets that failed spectacularly. 
Python&#8217;s success story is the opposite: a single person with good taste, a clear philosophy, and the patience to let a community grow organically around a shared set of values.</p><p>Van Rossum didn&#8217;t try to make the fastest language, or the most theoretically pure, or the most feature-rich. He tried to make the most <em>humane</em>. And it turned out that optimizing for the programmer&#8217;s experience &#8212; for readability, for simplicity, for the idea that there should be one obvious way to do something &#8212; was the most powerful optimization of all.</p><p>Happy birthday, Python. Thirty-five never looked so good.</p><div><hr></div><p><em>Compiled is a daily newsletter exploring one moment from the history of programming. New issue every morning.</em></p><p><em>Sources: Guido van Rossum&#8217;s &#8220;History of Python&#8221; blog; Python documentation; Computer History Museum; van Rossum&#8217;s original alt.sources post, February 20, 1991.</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.mauriziomorri.tech/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Maurizio Morri Substack is a reader-supported publication. 
To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Speculative Decoding and Disaggregated Serving]]></title><description><![CDATA[The New Tricks That Make LLMs Feel Fast]]></description><link>https://www.mauriziomorri.tech/p/speculative-decoding-and-disaggregated</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/speculative-decoding-and-disaggregated</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Fri, 20 Feb 2026 01:32:47 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The best serving breakthroughs in AI right now are not new foundation models. They are computer architecture moves applied to inference, because modern decoding is often memory bound. Once prefill is done, every new token is dominated by moving and updating attention state, and the attention state grows with context length. That is why the industry has become obsessed with KV cache, not as an implementation detail, but as the object that decides latency, throughput, and cost.</p><p>The first big lever is speculative decoding. Instead of generating one token at a time with the expensive model, you let a smaller draft model propose multiple tokens, then you verify them with the large model and accept the ones that match. If it sounds like branch prediction, it should. 
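</p><p>The control flow is easy to sketch. The following toy (plain Python, with stand-in draft and target functions that are simple arithmetic, nothing from vLLM or any real serving stack) shows the core loop: the draft proposes k tokens, the target verifies them, the matching prefix is accepted, and the first disagreement is replaced by the target&#8217;s own token.</p>

```python
# Toy speculative decoding loop. draft_model and target_model are
# hypothetical stand-ins (simple arithmetic, not neural nets) so the
# accept/verify control flow is runnable on its own.

def draft_model(last_token: int, k: int) -> list[int]:
    # Cheap "draft": guesses the next k tokens from the last one.
    return [(last_token + i) % 7 for i in range(1, k + 1)]

def target_model(last_token: int) -> int:
    # Expensive "target": the authoritative next token.
    return (last_token + 1) % 5

def speculative_step(tokens: list[int], k: int = 4) -> list[int]:
    """One speculative step: accept the draft's proposals up to the
    first disagreement, then append the target's corrected token."""
    proposals = draft_model(tokens[-1], k)
    accepted, cur = [], tokens[-1]
    for p in proposals:
        t = target_model(cur)       # real systems do one batched pass
        if p != t:
            accepted.append(t)      # first mismatch: take target's token
            break
        accepted.append(p)          # agreement: token accepted cheaply
        cur = p
    else:
        accepted.append(target_model(cur))  # all k accepted: bonus token
    return tokens + accepted
```

<p>In a real system the k verifications happen in a single batched forward pass of the large model, which is where the latency win comes from; the toy checks them one at a time only to keep the control flow visible.</p><p>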
vLLM treats this as a first-class feature specifically because it can reduce inter-token latency in memory-bound regimes, which is exactly where long-context systems live. <a href="https://docs.vllm.ai/en/latest/features/spec_decode/">https://docs.vllm.ai/en/latest/features/spec_decode/</a></p><p>What changed recently is that speculative decoding is becoming an ecosystem, not a trick. The SGLang team published SpecForge as a framework for training draft models that port cleanly into their serving stack, which is a tell that serious operators want repeatable workflows, not one-off hacks. <a href="https://github.com/sgl-project/SpecForge">https://github.com/sgl-project/SpecForge</a> AMD&#8217;s developer hub goes further and documents reproducible speculative decoding performance work in a real serving setup, including a concrete speedup claim in their tutorial context.
<a href="https://rocm.docs.amd.com/projects/ai-developer-hub/en/latest/notebooks/inference/speculative_decoding_deep_dive.html">https://rocm.docs.amd.com/projects/ai-developer-hub/en/latest/notebooks/inference/speculative_decoding_deep_dive.html</a></p><p>The second big lever is disaggregated inference, splitting prefill and decode onto different resources so each phase can scale independently. The hard part is obvious the moment you do this: you must transfer the KV cache efficiently from the prefill side to the decode side, and that turns your serving system into a distributed memory system. The DistServe retrospective summarizes how the field evolved after the initial push, and it lists a whole family of follow-on systems that focus specifically on KV cache transfer, scheduling, and network constraints. <a href="https://hao-ai-lab.github.io/blogs/distserve-retro/">https://hao-ai-lab.github.io/blogs/distserve-retro/</a> A separate recent overview frames this evolution as eras of KV cache handling, with disaggregation as the key inflection point because it forces explicit cache movement and cache economics. <a href="https://www.modular.com/blog/the-five-eras-of-kvcache">https://www.modular.com/blog/the-five-eras-of-kvcache</a></p><p>The third lever is kernel-level efficiency, especially attention. If attention is where you spend your memory bandwidth and your time, you want kernels that overlap data movement with compute and exploit hardware features like Tensor Memory Accelerator paths and low-precision math. FlashAttention 3 is a good illustration of the direction, focusing on asynchrony and hardware-aware scheduling on Hopper-class GPUs. <a href="https://pytorch.org/blog/flashattention-3/">https://pytorch.org/blog/flashattention-3/</a></p><p>These levers converge on one uncomfortable conclusion. Serving is now a memory hierarchy problem.
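</p><p>A back-of-envelope calculation makes the pressure concrete. The shapes below are assumed for illustration, roughly a Llama-70B-class model with grouped-query attention; they are not numbers from any of the sources above:</p>

```python
# Back-of-envelope KV cache footprint per request. The model shape is
# assumed (roughly Llama-70B-class with grouped-query attention),
# purely for illustration.

def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, dtype_bytes: int) -> int:
    # Factor of 2: one K tensor and one V tensor per layer.
    return 2 * layers * kv_heads * head_dim * seq_len * dtype_bytes

GIB = 1024 ** 3
fp16 = kv_cache_bytes(layers=80, kv_heads=8, head_dim=128,
                      seq_len=128_000, dtype_bytes=2)
fp8 = kv_cache_bytes(layers=80, kv_heads=8, head_dim=128,
                     seq_len=128_000, dtype_bytes=1)
print(f"128k-token request: {fp16 / GIB:.1f} GiB at fp16, "
      f"{fp8 / GIB:.1f} GiB at fp8")
```

<p>At fp16, a single 128k-token request under these assumptions holds roughly 39 GiB of KV state, which is why quantizing the cache and deciding which tier each request spills to stops being optional.</p><p>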
GPU HBM is the hottest tier, but long context pushes you to decide what spills to host memory, what spills to SSD, and what can be reused across requests. That is why you are seeing discussion of extending context storage beyond HBM, and why new memory ideas get framed explicitly for inference. Even the idea of high-bandwidth flash is marketed in terms of augmenting HBM for inference workloads, which tells you where operators feel the pain. <a href="https://www.tomshardware.com/pc-components/dram/sandisks-new-hbf-memory-enables-up-to-4tb-of-vram-on-gpus-matches-hbm-bandwidth-at-higher-capacity">https://www.tomshardware.com/pc-components/dram/sandisks-new-hbf-memory-enables-up-to-4tb-of-vram-on-gpus-matches-hbm-bandwidth-at-higher-capacity</a></p><p>Low precision is the final accelerant because it reduces memory footprint and increases throughput, but it only works if you manage accuracy. NVIDIA positions FP8 as a supported datatype for higher throughput on H100-class hardware and documents how to use FP8 and FP4-style formats through Transformer Engine, which is another signal that inference efficiency is becoming standardized, not experimental. <a href="https://docs.nvidia.com/deeplearning/transformer-engine/user-guide/examples/fp8_primer.html">https://docs.nvidia.com/deeplearning/transformer-engine/user-guide/examples/fp8_primer.html</a></p><p>If you are building serious systems, the implication is that you should benchmark like an infrastructure engineer, not like a model demo. Long context, multi-turn sessions, and agentic workloads stress KV cache, cache movement, and concurrency regimes.
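</p><p>In practice that means instrumenting the decode stream itself. Here is a minimal sketch of the bookkeeping, assuming some token iterator from whatever serving stack you use; the <code>token_stream</code> argument is a placeholder, not a specific client API:</p>

```python
# Minimal inter-token-latency (ITL) bookkeeping for a streamed response.
# token_stream is any iterable that yields tokens as they arrive; it is
# a placeholder, not a specific serving API.
import statistics
import time

def measure_itl(token_stream) -> list[float]:
    """Record the gap (in seconds) before each streamed token arrives."""
    gaps, last = [], time.perf_counter()
    for _ in token_stream:
        now = time.perf_counter()
        gaps.append(now - last)
        last = now
    return gaps

def summarize(gaps: list[float]) -> dict[str, float]:
    # Percentiles, not means: tail stalls are what users actually feel.
    qs = statistics.quantiles(gaps, n=100)
    return {"p50": qs[49], "p95": qs[94], "max": max(gaps)}
```

<p>Report p50 and p95 inter-token latency per concurrency level; averages hide exactly the tail stalls that cache pressure and prefill interference create.</p><p>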
The wins you will feel in production are increasingly coming from speculative decoding plus smarter cache handling plus kernels that are built around the real bottleneck, which is memory traffic, not raw FLOPS.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.mauriziomorri.tech/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Maurizio&#8217;s Substack! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Science, Tech, and the Future of Biology ]]></title><description><![CDATA[Me and Tech]]></description><link>https://www.mauriziomorri.tech/p/science-tech-and-the-future-of-biology</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/science-tech-and-the-future-of-biology</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Mon, 17 Mar 2025 20:09:51 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>A bit about me: I have a background in <strong>molecular biology, bioinformatics, and cybersecurity</strong>, and I&#8217;ve spent my career at the frontier of <strong>genomics, aging research, and AI-driven health solutions</strong>. 
From leading research teams to exploring how computational tools can revolutionize medicine, my work has always been driven by one question:</p><p>&#128073; <em>How can we use technology to decode and extend human potential?</em></p><p>Here, I&#8217;ll be writing about:</p><p>&#128300; Advances in genomics, aging, and computational biology</p><p>&#129504; The role of AI and machine learning in biotech</p><p>&#128640; The challenges and opportunities of building deep-tech startups</p><p>&#128269; Cybersecurity in biomedical research</p><p>If you&#8217;re interested in where science meets innovation, <strong>subscribe</strong> and join the conversation. I&#8217;d love to hear from you.</p><p>Let&#8217;s explore the future together.</p><p>Maurizio</p><div><hr></div><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.mauriziomorri.tech/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Maurizio&#8217;s Substack! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Coming soon]]></title><description><![CDATA[This is Maurizio Morri Substack.]]></description><link>https://www.mauriziomorri.tech/p/coming-soon</link><guid isPermaLink="false">https://www.mauriziomorri.tech/p/coming-soon</guid><dc:creator><![CDATA[Maurizio Morri]]></dc:creator><pubDate>Mon, 17 Mar 2025 18:10:13 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!94A5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb64901d-583e-4e70-9356-5674458ef0e7_144x144.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>This is Maurizio Morri Substack.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.mauriziomorri.tech/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.mauriziomorri.tech/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item></channel></rss>