- When America Had No Standards
- The Bureau is Born
- Fire, Safety, and Standards
- The Electrical Revolution
- Radio Days
- The War Years Transform Everything
- The Magic Fuze
- The First Smart Bomb
- Creating an Optical Glass Industry
- SEAC: The Dawn of Computing
- Atomic Clocks Redefine Time
- The First Public Code
- Building the Foundation for Cybersecurity
- A New Name, A New Mission
- Helping Small Manufacturers Compete
- Understanding National Tragedy
- The Key Findings
- Creating a Common Language for Cybersecurity
- Tomorrow’s Challenges Today
- Trustworthy Artificial Intelligence
- The Quantum Threat
- Building with Atoms
Your smartphone knows the time down to a billionth of a second. Your credit card transactions are protected by codes that would take supercomputers thousands of years to crack. School buses are painted that specific shade of yellow so drivers spot them instantly.
These facts all trace back to a federal agency most Americans have never heard of: the National Institute of Standards and Technology.
For more than a century, NIST and its predecessor have quietly built the invisible infrastructure that makes modern life work.
Without fanfare, this Department of Commerce agency has created the fundamental language that allows parts to fit, transactions to be fair, and technologies to talk to each other.
When America Had No Standards
At the dawn of the 20th century, the United States was an industrial powerhouse with a fatal weakness. The country lacked any national standards for measurement.
The chaos was almost comical. Scientists complained about eight different “authoritative” values for the U.S. gallon. At least four different “feet” were in common use. This wasn’t just inconvenient—it was crippling American industry.
The United States had “second-rate measurement infrastructure that lagged behind the capabilities of the United Kingdom, Germany, and other economic rivals,” according to NIST’s official history. American electrical instrument makers had to send their products to Germany’s national standards lab for certification. It was a costly symbol of scientific dependence.
The problem wasn’t new. The Constitution grants Congress the power to “fix the Standard of Weights and Measures.” President George Washington declared in 1790 that “uniformity in the currency, weights, and measures of the United States is an object of great importance.” John Quincy Adams called weights and measures “among the necessities of life to every individual of human society.”
Yet for over a century, this constitutional mandate went largely unfulfilled.
The Bureau is Born
Samuel Wesley Stratton, a University of Chicago physics professor, changed everything. Recruited to Washington in 1899 to revitalize the government’s weights and measures work, Stratton quickly realized that simple reorganization wouldn’t suffice.
He envisioned a new, independent institution—a national standards laboratory on par with those in Europe. Stratton became the driving force behind creating the National Bureau of Standards. He drafted the arguments, assembled an unprecedented coalition of scientists and industrialists, and “mesmerized the House Committee” with his passionate case for the new agency.
Congress was convinced. On March 3, 1901, it passed the Organic Act, establishing the National Bureau of Standards as the federal government’s first physical science research laboratory. With an initial staff of just 12 and Stratton as its first director, the fledgling agency set out to build the nation’s technical foundation from scratch.
Fire, Safety, and Standards
The new Bureau proved its worth by tackling immediate, high-stakes problems that directly affected American lives.
The Great Baltimore Fire of 1904 provided a brutal lesson in the consequences of lacking standards. For more than 30 hours, fire raged through the city, destroying over 1,500 buildings. Fire departments from Washington, D.C., and New York City rushed to help, only to discover their hoses wouldn’t fit Baltimore’s hydrants.
The Bureau launched an exhaustive study and found more than 600 different fire hose fitting variations across the United States. Working with the newly formed National Fire Protection Association, the Bureau championed a national standard for fire hydrant connections. This single act of standardization ensured that fire companies could work together seamlessly—a legacy that continues protecting American communities today.
The Bureau brought similar order to the marketplace. In 1905, it convened the first National Conference on Weights and Measures, bringing together state officials to create model laws, distribute uniform physical standards, and train inspectors. This methodical work was the antidote to the commercial chaos of competing “gallons” and “feet.”
The Electrical Revolution
As technology advanced, so did the Bureau’s scope. The electrical industry presented both immense promise and novel dangers.
Coal mining, one of the first industries to adopt electric lighting, quickly recognized that electrical sparks in gassy mines could be catastrophic. In 1909, the industry turned to the Bureau for help. The result was the nation’s first model electrical safety code, published in 1915, establishing the agency’s role in public safety.
To support manufacturing, the Bureau pioneered modern quality control. In 1906, at the American Foundrymen’s Association’s request, the Bureau began producing “standard samples” of iron. These samples, later known as Standard Reference Materials, allowed different foundries to calibrate their instruments against a common benchmark, ensuring consistent quality.
Radio Days
The Bureau was also at the forefront of the radio revolution.
During World War I, Bureau researchers patented an improved radio direction finder—an antenna that could pinpoint radio transmission sources. The device served as a prototype for the U.S. Navy and was widely used to locate enemy ships and submarines.
After the war, commercial radio broadcasting exploded, turning the airwaves into a cacophony of interfering signals. To solve this problem, the Bureau launched its own radio station, WWV, in 1923. From Maryland, WWV began broadcasting standard frequencies, allowing commercial stations to calibrate their equipment and stay in their assigned lanes, bringing order to the radio spectrum.
| Everyday Activity | NIST’s Contribution |
|---|---|
| Fire Safety | Standardized fire hose couplings after the 1904 Baltimore fire, ensuring interoperability between fire departments |
| Food & Nutrition | Provides Standard Reference Materials so food producers can accurately measure nutrients like vitamins and minerals in products from cereal to pet food |
| Timekeeping (GPS, Internet) | Developed the world’s first atomic clocks, which provide the basis for Coordinated Universal Time (UTC) that synchronizes GPS, cell networks, and the internet |
| Automotive Safety | Research in the 1960s proved the life-saving effectiveness of seat belts, leading to laws requiring them. Develops tests for modern crash-warning systems |
| Medical Care | Establishes national standards for accurate radiation dosage in X-rays and ensures pharmaceutical companies can precisely measure medicine in each pill |
| Secure Communications | Developed the first public encryption standards (DES) and continues to lead the world in creating new standards (AES, Post-Quantum) that protect online banking, email, and messaging |
| School Buses | Helped standardize the iconic “National School Bus Glossy Yellow” color in 1939 for high visibility and safety |
| Digital Photography | Built one of the first computers, SEAC, which was used to create the world’s first digital image, a precursor to every selfie and digital photo today |
The War Years Transform Everything
World War II transformed the National Bureau of Standards from a measurement agency into a powerhouse of invention. The existential threat to the nation forced the agency beyond its traditional role and into direct, high-stakes innovation.
Even before Pearl Harbor, the Bureau was preparing. On September 1, 1939—the day Germany invaded Poland—Director Lyman J. Briggs sent a memo to the Department of Commerce outlining services the Bureau was ready to provide “in the event of war.” The agency was prepared to test strategic materials, certify instruments for export, and dramatically increase production of critical supplies like optical glass.
Once the U.S. entered the war, the Bureau became vital to the nation’s scientific mobilization. Dozens of war-related projects were spread across its laboratories, but two top-secret efforts stand out for their revolutionary battlefield impact.
The Magic Fuze
The first breakthrough was the radio proximity fuze. Before its invention, artillery shells and bombs were detonated by timers or on direct impact—both highly inefficient methods.
The proximity fuze was a miniature radio transceiver housed in a projectile’s nose. It emitted radio waves and detonated the explosive when the signal reflected from a target grew strong enough, meaning the projectile had closed to the most destructive distance. This “magic fuze” was a game-changer, dramatically increasing the lethality of anti-aircraft fire against Japanese planes in the Pacific and of artillery against German ground troops in Europe.
By war’s end, 8 million of these Bureau-developed fuzes had been produced.
The First Smart Bomb
The second innovation was the “Bat” guided missile, one of the world’s first “smart bombs.” This radar-guided glide bomb was designed to be launched from a plane outside enemy anti-aircraft gun range. The Bat emitted shortwave radiation and would “home” in on radar signals reflected from its target, typically an enemy ship.
It was a remarkable early example of autonomous weapon technology, representing a major leap forward in precision-guided munitions.
Creating an Optical Glass Industry
One of the Bureau’s most critical wartime contributions was in materials science, specifically optical glass production.
In the early 20th century, the United States was almost entirely dependent on European manufacturers for high-quality glass needed for military optics—the lenses and prisms inside periscopes, binoculars, rangefinders, and bombsights. When World War I cut off this supply, it created an acute national security crisis.
When World War II loomed, the nation faced the same crisis but on a much larger scale. The Bureau undertook the monumental task of creating a domestic optical glass industry virtually from scratch.
This wasn’t simple recipe-following. Creating flawless, high-purity glass with precise optical properties is an exacting science. Bureau scientists had to develop and perfect every stage in-house, from formulating batch compositions and designing clay melting pots to defining precise melting schedules, molding procedures, and annealing techniques to prevent internal stresses.
The results were staggering. Before the war, the Bureau produced five types of optical glass for the Navy. During the war, it expanded to 28 different types, supplying the vast needs of the U.S. Armed Forces. On September 1, 1939, the Bureau reported it was prepared to increase output from 9,000 pounds per year to 75,000 pounds per year.
This effort ensured that American soldiers, sailors, and airmen had the high-quality optical instruments they needed to effectively target the enemy.
SEAC: The Dawn of Computing
The war’s end didn’t end demand for the Bureau’s advanced research. The massive computational challenges of the post-war era required a new kind of tool.
In the late 1940s, with U.S. Air Force funding, the Bureau embarked on designing and building one of the world’s first electronic computers.
The result was the Standards Eastern Automatic Computer (SEAC). Completed in 1950, it’s widely considered the first fully operational, stored-program electronic computer in the United States. At a time when computers were built with thousands of bulky, power-hungry vacuum tubes, SEAC was a technological pioneer.
It was the first computer to perform all logic functions using solid-state devices—10,000 germanium diodes. This was a critical step away from vacuum tubes and toward the smaller, faster, more reliable electronics that define the modern era.
SEAC was immediately put to work on the nation’s most complex problems. One early target was optimization: in a landmark 1947 experiment, Bureau mathematicians had manually performed the calculations for the “simplex method,” a new algorithm for solving optimization problems, to determine the cheapest possible diet that met nutritional requirements.
Later, SEAC ran the first large-scale computational tests of the simplex method, proving its superiority over competing methods and establishing a foundational tool of operations research.
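The diet problem remains a textbook example of linear programming, the class of problems the simplex method solves. As a rough modern illustration (the foods, prices, and nutrient values below are invented for demonstration, not drawn from the 1947 study), a few lines of Python with SciPy’s linear-programming solver find the cheapest mix of foods that meets a set of nutrient minimums:

```python
# Illustrative diet problem: minimize cost subject to nutrient minimums.
# Foods, prices, and nutrient values are made up for demonstration only.
from scipy.optimize import linprog

foods = ["oatmeal", "milk", "beans"]
cost = [0.30, 0.50, 0.40]          # dollars per serving

# Rows: calories, protein (g), calcium (mg) supplied by one serving of each food.
nutrients = [
    [110, 160, 260],   # calories
    [4,   8,   15],    # protein
    [2,   285, 80],    # calcium
]
minimums = [2000, 55, 800]         # daily requirements

# linprog minimizes cost @ x subject to A_ub @ x <= b_ub, so the
# ">= minimum" nutrient constraints are written with flipped signs.
A_ub = [[-v for v in row] for row in nutrients]
b_ub = [-m for m in minimums]

result = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * len(foods))
for food, servings in zip(foods, result.x):
    print(f"{food}: {servings:.1f} servings/day")
print(f"minimum daily cost: ${result.fun:.2f}")
```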
Over its 13-year tenure, SEAC calculated sampling plans for the Census Bureau, computed stresses in aircraft structures, and determined wave functions for atoms. In a moment of remarkable foresight, SEAC was also used to create the world’s first digital image—a scanned photograph of a baby. This simple experiment was the ancestor of every digital photo, satellite image, and medical scan we see today.
Atomic Clocks Redefine Time
Perhaps no single Bureau invention from this era had a more profound impact than the atomic clock.
For millennia, humanity measured time by the movements of the heavens—Earth’s rotation on its axis. But Earth’s rotation isn’t perfectly stable. For the ultra-precise needs of modern science and technology, a better standard was needed.
The revolutionary idea came in 1945 from Columbia University physicist Isidor Rabi, who suggested that atoms’ natural, unvarying vibrations could be used as the ultimate timekeeping “pendulum.” Bureau scientists, with their world-leading expertise in microwave spectroscopy, were uniquely positioned to turn this idea into reality.
In 1949, a Bureau team led by Harold Lyons announced a historic breakthrough: the world’s first atomic clock. This device used ammonia molecule vibrations inside a waveguide to control a quartz crystal oscillator’s frequency. While groundbreaking, the ammonia-based design proved tricky to stabilize. The team quickly turned to a more suitable timekeeper: the cesium atom.
In 1952, the Bureau announced the first cesium atomic clock, known as NBS-1. This machine was far more accurate. After being moved to the Bureau’s new Boulder, Colorado laboratories in 1954, NBS-1 began regular service as the nation’s primary frequency standard in 1959.
Bureau work, along with parallel efforts at England’s National Physical Laboratory, launched a timekeeping revolution. Over the next decade, the Bureau built increasingly accurate cesium clocks: NBS-2, NBS-3, and in 1968, the world-renowned NBS-4, which was so stable it served as a core component of the U.S. time system for over two decades.
This pioneering research culminated in a paradigm shift for global science. In 1967, the 13th General Conference on Weights and Measures formally abandoned the astronomical definition of the second. The fundamental unit of time was redefined based on cesium-133 atom vibrations: precisely 9,192,631,770 oscillations.
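The definition is, at bottom, a counting rule: one second elapses when exactly 9,192,631,770 of these cesium oscillations have been counted. A back-of-the-envelope sketch (purely illustrative arithmetic, not NIST data) shows what that resolution means in practice:

```python
# The SI second: exactly 9,192,631,770 cesium-133 oscillations.
CESIUM_HZ = 9_192_631_770

# How many oscillations fit into one nanosecond (a billionth of a second)?
oscillations_per_ns = CESIUM_HZ / 1e9
print(f"{oscillations_per_ns:.2f} oscillations per nanosecond")   # about 9.19

# If a clock miscounted by a single oscillation every second,
# how far would it drift over a year?
seconds_per_year = 365.25 * 24 * 3600
drift_seconds = seconds_per_year / CESIUM_HZ
print(f"drift: {drift_seconds * 1e3:.1f} milliseconds per year")  # about 3.4 ms
```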
For the first time, the world’s timekeeping was no longer tied to our planet’s wobbling, but to a fundamental, immutable constant of the universe itself.
The First Public Code
As the United States transitioned from the industrial age to the information age, the Bureau found itself at the forefront of another technological revolution. Long before most Americans had touched a computer, Bureau scientists were grappling with fundamental digital world challenges: how to protect information, ensure privacy, and manage risks of a new, interconnected society.
In the early 1970s, computers were beginning to move out of government laboratories into mainstream business and finance. Banks, corporations, and federal agencies were starting to store and transmit sensitive data electronically. This created an urgent new problem: how to protect this information from unauthorized access.
At the time, encryption was almost exclusively the secret domain of military and intelligence organizations. There was no widely accepted, publicly available method for securing commercial data.
Recognizing this critical gap, the Bureau’s Institute for Computer Sciences and Technology initiated a landmark project in 1972 to develop a federal cryptographic standard. In 1973, the Bureau issued a public call for proposals for a suitable encryption algorithm.
The initial response was disappointing. But a second call in 1974 yielded a promising candidate from IBM: an algorithm known as “Lucifer,” developed by a research team led by Horst Feistel.
What followed was both revolutionary and controversial. The Bureau, in secret consultation with the National Security Agency, began evaluating and modifying the Lucifer algorithm. This collaboration immediately raised red flags in the nascent academic computer science community.
The most contentious change was reducing the key size. Lucifer had used a 128-bit key, but the version proposed for the new standard had its key shortened to 56 bits.
Critics loudly voiced two primary concerns. First, they argued that a 56-bit key was too short and would soon be vulnerable to a “brute-force” attack, where an adversary simply tries every possible key until finding the correct one. Second, they feared that the NSA had inserted a hidden “trapdoor” into the algorithm’s complex internal functions, which would allow the agency to decrypt messages even without the key.
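The key-length objection comes down to simple arithmetic: every bit added to a key doubles the number of possibilities a brute-force attacker must try. A quick sketch (the attacker’s speed is a made-up figure for illustration) shows the gulf between a 56-bit and a 128-bit keyspace:

```python
# Compare brute-force keyspaces for 56-bit (DES) and 128-bit (Lucifer) keys.
des_keys = 2 ** 56        # about 7.2e16 possible keys
lucifer_keys = 2 ** 128   # about 3.4e38 possible keys

# Assume a hypothetical attacker testing one billion keys per second.
guesses_per_second = 1_000_000_000
seconds_per_year = 365.25 * 24 * 3600

print(f"56-bit keyspace:  {des_keys / guesses_per_second / seconds_per_year:.1f} years")
print(f"128-bit keyspace: {lucifer_keys / guesses_per_second / seconds_per_year:.2e} years")

# Specialized hardware eventually did search the 56-bit space in a matter of
# days (the EFF's 1998 "Deep Crack" machine), vindicating the critics' concern.
```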
Despite intense debate, the Bureau moved forward, holding two public workshops in 1976 to discuss the algorithm’s mathematical foundations and utility. In 1977, the Data Encryption Standard (DES) was issued as Federal Information Processing Standard 46.
Its adoption marked a radical departure in the history of cryptography. For the first time, a powerful, government-backed encryption algorithm was made completely public. Every detail of its operation was published, allowing anyone to study it, critique it, and implement it.
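Because every detail was public, the cipher could be reimplemented by anyone, and it still ships in common cryptographic libraries for study. A minimal sketch, assuming the third-party pycryptodome package is installed (single DES is long obsolete and is shown here only as a historical illustration):

```python
# Historical illustration only: single DES is obsolete and insecure.
# Assumes the third-party pycryptodome package is installed.
from Crypto.Cipher import DES

key = b"8bytekey"                        # 64-bit key, 56 effective key bits
cipher = DES.new(key, DES.MODE_ECB)

plaintext = b"ATTACK AT DAWN\x00\x00"    # padded to the 8-byte block size
ciphertext = cipher.encrypt(plaintext)
print(ciphertext.hex())

# Decryption with the same key recovers the original message.
assert DES.new(key, DES.MODE_ECB).decrypt(ciphertext) == plaintext
```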
This open approach had a profound impact. DES was mandated for protecting unclassified U.S. government data and became mandatory for all electronic fund transfers in the federal banking system. It was quickly adopted by ANSI and other standards bodies, becoming the international standard for protecting financial and commercial data for two decades.
More importantly, DES’s public nature “jump-started” non-military cryptography study. It gave academic researchers a common, robust algorithm to analyze, attack, and learn from. An entire generation of cryptographers cut their teeth on DES, and it became the benchmark against which every subsequent symmetric-key algorithm has been measured.
This process established a new paradigm: security through public scrutiny rather than obscurity—a principle that defines NIST’s approach today.
Building the Foundation for Cybersecurity
DES was part of a much broader effort at the Bureau to address the security and privacy implications of the computer age. The agency’s work was driven by new legislation reflecting society’s growing concerns about data. The Privacy Act of 1974 established requirements for federal agencies to protect personally identifiable information, giving the Bureau a clear mandate to develop needed technical guidelines.
Throughout the 1970s and 1980s, the Bureau published seminal documents that laid the conceptual groundwork for modern cybersecurity, addressing issues decades before they became mainstream public concerns.
In 1977, the Bureau published guidance on using passwords for controlling access to computer resources, and in 1985, it issued the first Password Usage Standard, which included concepts like password composition and lifetime that remain fundamental to user authentication systems today.
In 1979, the Bureau published one of the first formal methodologies for conducting risk analysis for automatic data processing systems. This moved computer security beyond a simple checklist of controls toward a more strategic, risk-based approach.
Recognizing that security incidents were inevitable, the Bureau issued a guide to contingency planning in 1981, helping federal agencies think systematically about preparing for and recovering from disruptions to their computer systems.
With the rise of the PC, security threats moved from the centralized mainframe to the desktop. In 1985, the Bureau published “Security of Personal Computer Systems: A Management Guide,” one of the first documents to provide practical security advice for the new world of decentralized computing.
This foundational work culminated in the Computer Security Act of 1987. This landmark legislation formally assigned the Bureau primary responsibility for developing standards and guidelines for unclassified federal computer systems security.
A New Name, A New Mission
The 1980s were a period of profound economic anxiety for the United States. American industries, once undisputed world leaders, found themselves losing ground to fierce international competition. The nation’s balance of trade in manufactured goods had plummeted from a surplus in 1981 to a staggering deficit by mid-decade.
A deluge of government and industry reports warned that America was falling behind in key technology areas, losing ground in particular to Japan, which was rapidly commercializing U.S. inventions and manufacturing high-quality products with stunning efficiency.
This growing realization that the world economy’s structure had changed, undermining America’s traditional advantages, spurred Congress to act. The response would trigger the most dramatic transformation in the Bureau’s history.
The legislative vehicle was the Omnibus Trade and Competitiveness Act of 1988, a massive bill signed into law by President Ronald Reagan on August 23, 1988. Buried within its 460 pages was a subpart that would fundamentally redefine the purpose and identity of the nation’s standards laboratory.
First, it changed the agency’s name. The National Bureau of Standards became the National Institute of Standards and Technology (NIST). The change was far more than symbolic. It signaled a profound shift in mission.
Second, the act gave NIST a new, proactive mandate. While preserving its traditional functions in measurement science and standards, the law assigned the agency a daunting new task: “to augment its unique ability to enhance the competitiveness of American industry.”
The purpose was to help U.S. companies, particularly small and medium-sized firms, develop and capitalize on advanced technologies, modernize their manufacturing processes, and accelerate the commercialization of research breakthroughs.
To carry out this expanded mission, the act created two major new programs:
The Advanced Technology Program (ATP): This program was designed to act as a catalyst for innovation by providing seed funding for high-risk, high-reward research and development projects. The ATP would form cost-sharing partnerships with private industry to bridge the gap between basic research and commercial products.
The Manufacturing Extension Partnership (MEP): This initiative was created to establish a nationwide network of centers dedicated to helping America’s vast base of small and medium-sized manufacturers adopt new technologies and improve their performance.
Helping Small Manufacturers Compete
The Manufacturing Extension Partnership addressed a classic “market failure” seen as a key weakness in the U.S. industrial ecosystem. While large manufacturers were at the cutting edge of technology, hundreds of thousands of smaller firms that made up their supply chains often lacked the resources, expertise, and time to invest in modernizing their processes.
These small and medium manufacturers—which in 2016 comprised 99% of all manufacturing establishments and employed nearly three-quarters of the manufacturing workforce—were the essential foundation for any potential resurgence in American manufacturing.
The MEP program was designed to correct this imbalance. It was established as a unique federal-state partnership, with NIST providing a portion of the funding and technical oversight, and the remainder coming from state contributions and fees paid by companies for services.
The program consists of a network of 51 centers, one in every state and Puerto Rico, managed by non-profit organizations, state agencies, or universities. With close to 1,300 technical experts operating out of 600 locations, these centers provide hands-on, affordable assistance to local manufacturers in areas critical to their success: cost reduction through lean manufacturing, sales growth through innovation and exporting, quality improvement through ISO standards, and workforce development.
The impact has been significant and quantifiable. An independent firm surveys MEP clients quarterly to measure the program’s results. In Fiscal Year 2016 alone, the program was credited with helping client companies achieve $9.3 billion in new and retained sales, $1.4 billion in cost savings, and the creation or retention of 86,602 jobs.
This success translates into a powerful return on investment for taxpayers. The same 2016 data showed that the federal investment in MEP generated a nearly nine-fold increase in federal personal income tax revenue—an 8.7:1 return.
Understanding National Tragedy
As the United States entered the 21st century, NIST’s role continued evolving, solidifying its position as the nation’s trusted, non-regulatory arbiter for complex and high-stakes technological challenges.
In the immediate aftermath of the September 11, 2001, terrorist attacks, a stunned nation demanded answers. Amid the grief and confusion, a critical technical question emerged: Why did the World Trade Center towers, massive structures designed to withstand significant impacts, collapse so completely?
Answering this question was essential for understanding the tragedy and ensuring future building safety.
In response to calls from Congress and the public, the National Construction Safety Team Act was passed in 2002. This law gave NIST the authority and responsibility to conduct a comprehensive, fact-finding technical investigation into the collapses of the Twin Towers and the nearby 47-story WTC 7.
The investigation was a monumental undertaking, lasting several years and involving more than 200 technical experts, including 85 NIST staff members. The team meticulously gathered and analyzed every available piece of evidence. They reviewed tens of thousands of pages of design and construction documents, conducted interviews with 1,056 surviving occupants and 116 emergency responders, and analyzed 236 pieces of structural steel recovered from the wreckage. They also compiled and studied thousands of photographs and over 150 hours of video footage to reconstruct the events of that day.
After years of painstaking analysis, computer modeling, and large-scale fire tests, NIST released its final reports—43 volumes on the Twin Towers in 2005 and 3 volumes on WTC 7 in 2008.
The Key Findings
The collapses were not caused by the aircraft impacts alone. NIST concluded that the towers were remarkably robust and withstood the initial impacts, redistributing the loads from severed columns.
The critical factor was the widespread dislodging of fireproofing. The impacts and resulting debris fields stripped the lightweight, spray-on fire-resistive material from the steel columns and floor trusses over multiple floors.
Subsequent fires weakened the unprotected steel. The thousands of gallons of dispersed jet fuel ignited intense, multi-floor fires that reached temperatures as high as 1,000 degrees Celsius (1,800 degrees Fahrenheit). Without their protective insulation, the heated steel floor trusses began to sag significantly.
The sagging floors initiated the collapse. As the floors sagged, they pulled inward on the massive perimeter columns. This inward pull, combined with damage from the impacts and heat from the fires, caused the exterior walls to buckle. This failure of a single exterior wall was enough to trigger a rapid, progressive collapse of the entire structure above the impact zone, which then fell onto the floors below.
For WTC 7, which was not struck by an aircraft, the cause was different but related. Debris from the North Tower’s collapse ignited fires on multiple floors of WTC 7. These fires burned uncontrolled for nearly seven hours. The prolonged heating caused a critical floor girder to expand and push a key structural column off its seat. This single failure triggered a cascade of floor failures, which led to the progressive collapse of the entire building.
The legacy of the WTC investigation has been profound. NIST issued 31 recommendations for improvements to building and fire codes, standards, and practices. While NIST has no regulatory authority, these science-based recommendations have been widely adopted, leading to significant changes in design and construction practices for high-rise buildings around the world.
Creating a Common Language for Cybersecurity
Just as it had done with DES decades earlier, NIST stepped in to provide a common language for a new era of digital risk. By the 2010s, cybersecurity had evolved from a technical niche into a critical issue of national and economic security.
In 2013, President Barack Obama issued Executive Order 13636, “Improving Critical Infrastructure Cybersecurity,” which called for developing a voluntary, risk-based framework to help organizations manage cyber threats.
NIST was tasked with leading this effort precisely because it was a non-regulatory agency with a long history of successfully convening industry, academia, and government to solve complex national problems. Through a year-long, open process of workshops and public comment, NIST developed the Cybersecurity Framework, first released in 2014.
The Framework’s power lies in its simplicity and flexibility. It’s not a rigid, one-size-fits-all mandate. Instead, it provides a common vocabulary and structured approach that any organization—regardless of size, sector, or technical sophistication—can use to understand, manage, and communicate its cybersecurity risks.
The Framework is organized around five core functions, illustrated in the brief sketch that follows the list:
Identify: Understand your business, assets, data, and the cybersecurity risks you face.
Protect: Implement appropriate safeguards to ensure the delivery of critical services.
Detect: Develop and implement activities to identify the occurrence of a cybersecurity event.
Respond: Have a plan to take action once a cybersecurity incident is detected.
Recover: Have a plan for resilience and to restore capabilities after an incident.
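As a purely illustrative sketch (the outcomes, scores, and structure below are invented, not defined by NIST), an organization might record a lightweight self-assessment keyed to the five functions and use it to flag where attention is needed first:

```python
# Illustrative only: a toy self-assessment keyed to the five core functions.
# The example outcomes and maturity scale are invented, not NIST categories.
from dataclasses import dataclass

@dataclass
class FunctionAssessment:
    example_outcome: str
    maturity: int          # 1 (ad hoc) to 4 (adaptive), per an internal scale

assessment = {
    "Identify": FunctionAssessment("Asset inventory covers all production systems", 3),
    "Protect":  FunctionAssessment("Multi-factor authentication on remote access", 2),
    "Detect":   FunctionAssessment("Centralized log monitoring with alerting", 2),
    "Respond":  FunctionAssessment("Written incident response plan, tested annually", 1),
    "Recover":  FunctionAssessment("Offsite backups restored in quarterly drills", 2),
}

# List the weakest functions first so leadership can prioritize investment.
for name, item in sorted(assessment.items(), key=lambda kv: kv[1].maturity):
    print(f"{name:8s} maturity {item.maturity}: {item.example_outcome}")
```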
The Framework was an immediate and global success. It has been downloaded millions of times and is used by thousands of organizations worldwide, from small businesses to Fortune 500 companies and foreign governments, as the gold standard for cybersecurity risk management.
In February 2024, reflecting the evolution of the cybersecurity landscape, NIST released Framework 2.0. The most significant update was the addition of a new, central function: Govern. This new pillar emphasizes that cybersecurity is not just a technical issue for the IT department; it’s a core responsibility of executive leadership and a fundamental aspect of enterprise risk management.
The Govern function provides outcomes related to establishing an organization-wide cybersecurity strategy, defining roles and responsibilities, setting policies, and managing cybersecurity in the supply chain. Its addition marks the maturation of cybersecurity from a technical discipline to a critical element of corporate governance.
Tomorrow’s Challenges Today
Today, NIST is working at the cutting edge of science and technology, developing the measurement science and standards needed to harness the benefits of the 21st century’s most transformative technologies while mitigating their risks.
Trustworthy Artificial Intelligence
As AI systems become more powerful and integrated into society, ensuring they are safe, reliable, fair, and secure is a paramount challenge. NIST is leading the nation’s effort to build trust in AI.
In 2023, it released the AI Risk Management Framework, a voluntary guide for organizations that design, develop, or use AI systems to help them manage the vast spectrum of AI-related risks. Like the Cybersecurity Framework, the AI Framework provides a structured, flexible approach focused on four key functions: Govern, Map, Measure, and Manage.
NIST is also conducting foundational research into measuring and mitigating AI bias, improving the explainability of “black box” models, and securing AI systems from novel attacks. Through its new U.S. AI Safety Institute, NIST is charged with developing evaluation methods for the most advanced AI models to assess their capabilities and risks.
The Quantum Threat
The digital world faces a looming cryptographic apocalypse. Scientists predict that within the next decade or two, the development of large-scale quantum computers will render obsolete the public-key encryption that currently protects virtually all secure data on the internet, from financial transactions to government secrets.
A powerful quantum computer running Shor’s algorithm could easily break today’s standards. This creates an urgent “harvest now, decrypt later” threat, where adversaries can collect encrypted data today with the confidence that they will be able to decrypt it in the future.
To avert this crisis, NIST initiated a multi-year, global competition in 2016 to select a new generation of “quantum-resistant” encryption algorithms—algorithms based on mathematical problems believed to be hard for both classical and quantum computers to solve.
After years of intense public scrutiny and analysis by the world’s top cryptographers, NIST announced its first suite of post-quantum cryptography standards in August 2024. These standards—ML-KEM for general encryption and ML-DSA and SLH-DSA for digital signatures—are now finalized and ready for organizations to begin the long process of transitioning their systems to a new, quantum-secure foundation.
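For organizations starting that transition, open-source implementations of the new algorithms already exist. Below is a minimal sketch of an ML-KEM key exchange, assuming the liboqs-python bindings are available and expose the ML-KEM-768 parameter set (the package and algorithm names here are assumptions; consult your library’s documentation):

```python
# Illustrative ML-KEM (FIPS 203) key encapsulation using liboqs-python.
# Assumes the open-source liboqs-python bindings, built against a liboqs
# release that includes the ML-KEM-768 parameter set (an assumption).
import oqs

ALG = "ML-KEM-768"

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    # Receiver publishes a public key; the private key stays inside `receiver`.
    public_key = receiver.generate_keypair()

    # Sender encapsulates: produces a ciphertext plus a shared secret.
    ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext to recover the same shared secret.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

    assert shared_secret_sender == shared_secret_receiver
    print("shared secret established:", shared_secret_sender.hex()[:32], "...")
```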
Building with Atoms
The ability to see and manipulate matter at the scale of individual atoms—nanotechnology—is driving innovation across countless fields, from next-generation electronics and quantum computing to advanced materials and targeted drug delivery.
But to build reliable nanotechnology, you must first be able to measure it reliably. NIST provides the essential measurement science that underpins this entire field. This includes developing standard protocols for preparing and measuring nanomaterials, creating nanoscale Standard Reference Materials to ensure measurement accuracy, and operating the Center for Nanoscale Science and Technology NanoFab, a state-of-the-art user facility where researchers from industry and academia can access cutting-edge fabrication tools.
This foundational work is critical for turning the immense promise of nanotechnology into the tangible products and industries of the future.