1. Accelerated Materials Discovery
AI greatly speeds up the search for new materials by automating the initial screening of vast chemical and structural spaces. Machine learning models trained on existing databases can quickly evaluate millions of hypothetical compounds, focusing experiments on only the most promising candidates. This reduces the years of trial-and-error work traditionally needed to find novel materials. For example, AI-guided high-throughput searches have enabled researchers to survey entire composition families and crystal structures much faster than manual methods. In practice, this means that researchers can explore unconventional chemistries and complex alloys far more efficiently, accelerating the pace of discovery across materials science.

Large-scale studies demonstrate orders-of-magnitude gains in discovery efficiency using AI. Merchant et al. (2023) trained graph neural networks on ~48,000 known stable crystals and used them to predict about 2.2 million new candidate structures below the convex hull, vastly expanding the catalog of stable materials. Remarkably, 736 of those AI-predicted compounds were later confirmed by experiment, validating the predictions. In another example, Gao et al. (2024) used a neural-network “search engine” to identify 50 previously unknown altermagnetic materials (spanning metals, semiconductors, and insulators), subsequently confirmed by first-principles calculations. Autonomous labs have also put these ideas into practice: Szymanski et al. (2023) built an AI-driven synthesis platform (“A-Lab”) that, over a 17-day run, successfully made 41 new inorganic compounds out of 58 AI-suggested targets. The combination of ML-guided screening and automated synthesis thus enabled the discovery of dozens of novel materials in weeks instead of years.
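At its core, the screening step described above scores every hypothetical candidate with a cheap surrogate and forwards only the near-stable ones for expensive validation. The sketch below illustrates that filter; `predicted_energy_above_hull` is a hypothetical stand-in for a trained graph-network model, not a real predictor.

```python
import random

def predicted_energy_above_hull(candidate_id):
    """Hypothetical stand-in for a trained graph-network surrogate: returns a
    predicted energy above the convex hull in eV/atom."""
    rng = random.Random(candidate_id)  # deterministic per candidate
    return rng.uniform(-0.05, 0.60)

def screen_candidates(candidate_ids, threshold=0.0):
    """Keep only candidates predicted at or below the hull (stable)."""
    return [c for c in candidate_ids if predicted_energy_above_hull(c) <= threshold]

shortlist = screen_candidates(range(10_000))
print(f"{len(shortlist)} of 10000 candidates pass the stability screen")
```

In a real pipeline the shortlist would then be re-ranked by DFT before any synthesis is attempted.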
2. Predictive Property Modeling
AI models are used to predict material properties (mechanical, thermal, electronic, etc.) before a material is made. These models act as fast surrogates for expensive physics calculations or experiments, giving researchers quick estimates of key parameters. For instance, a neural network might predict a metal’s tensile strength or a ceramic’s thermal conductivity from its composition and structure. Accurate property predictions let scientists rule out poor candidates early. In many cases, the AI predictions match the fidelity of density functional theory (DFT) or experiments, enabling high-throughput virtual screening. This predictive capability streamlines development by identifying top candidates for synthesis based on target performance, and avoids costly trial-and-error on unlikely compositions.

Recent studies report high accuracies for AI property predictions, often rivaling traditional methods. Li et al. (2024) note that ML models have achieved “DFT-level accuracy” for properties like formation energy, and even R² > 0.95 for thermal conductivity using fewer than 100 training examples. For example, one neural model predicted formation energy with a mean absolute error (MAE) of ~0.064 eV/atom versus ~0.076 eV/atom for DFT, matching or slightly exceeding the accuracy of the original DFT reference. Multitask neural networks have been reported to compute total energy, charge density, and magnetic moment for a given structure hundreds of times faster than direct DFT, with comparable accuracy. Graph neural networks trained on large materials datasets have likewise been reported to deliver more accurate estimates of formation energies, bandgaps, and elastic moduli than standard DFT on broad test sets. These results show AI models can rapidly and reliably predict material properties, substantially reducing the need for expensive simulations or experiments in the early screening stage.
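As a minimal illustration of a property surrogate, the sketch below fits a distance-weighted k-nearest-neighbour model over toy two-feature composition descriptors and evaluates it with the same MAE metric quoted above. All numbers are illustrative, not real DFT or experimental data.

```python
import math

# Toy training set: (composition descriptor, formation energy in eV/atom).
TRAIN = [
    ((0.2, 0.8), -1.10), ((0.5, 0.5), -1.45),
    ((0.8, 0.2), -0.90), ((0.3, 0.7), -1.30),
]

def knn_predict(x, k=2):
    """Distance-weighted k-nearest-neighbour surrogate for formation energy."""
    nearest = sorted((math.dist(x, xi), yi) for xi, yi in TRAIN)[:k]
    weights = [1.0 / (d + 1e-9) for d, _ in nearest]
    return sum(w * y for w, (_, y) in zip(weights, nearest)) / sum(weights)

def mae(pairs):
    """Mean absolute error of the surrogate on held-out (x, y) pairs."""
    return sum(abs(knn_predict(x) - y) for x, y in pairs) / len(pairs)

HELD_OUT = [((0.4, 0.6), -1.38), ((0.6, 0.4), -1.20)]
print(f"surrogate MAE: {mae(HELD_OUT):.3f} eV/atom")
```

A production surrogate would be a graph neural network over crystal structures, but the screen-by-MAE workflow is the same.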
3. Inverse Design
Inverse design flips the discovery problem: it starts from desired material properties and works backward to propose candidate chemistries. AI algorithms (often generative models or optimization routines) suggest new compositions or structures predicted to meet specific targets. For example, one might specify a target bandgap or stiffness, and the model generates formulas or crystal prototypes likely to achieve it. This removes much of the human guesswork and explores unconventional solutions. Modern generative models can navigate chemical space more creatively, blending elements or structures in novel ways. Inverse design effectively automates the ideation of materials: rather than test many random compositions, AI recommends the most promising candidates with the right attributes.

Inverse design has led to concrete materials discoveries validated by computation and experiment. Zheng et al. (2024) used a graph autoencoder to design new polymer networks (vitrimers) targeting specific glass-transition temperatures (Tg). Their model generated candidates with Tg far beyond the original training range (one at ~569 K, another at ~248 K). For a target of 323 K, the AI proposed a polymer that was synthesized and measured to have Tg ≈ 311–317 K. Likewise, Wang et al. (2025) developed “MatterGen,” a diffusion-based generative model for crystals. Fine-tuned on a small dataset, MatterGen found 106 distinct hypothetical structures with extremely high bulk moduli (candidate superhard materials) using only 180 DFT evaluations, whereas a brute-force screening of the same chemical space found only 40 such structures. These successes illustrate that inverse-design AI can propose entirely new materials with targeted properties, and those candidates can be validated by first-principles calculations or experiments.
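The inverse-design loop can be caricatured as search against a property predictor: start from a random design vector and mutate it until the predicted property hits the target. Here `predict_tg` is a hypothetical linear surrogate standing in for a trained generative model or regressor; the descriptor features and coefficients are invented for illustration.

```python
import random

def predict_tg(x):
    """Hypothetical surrogate mapping a 3-feature polymer descriptor
    (crosslink density, backbone stiffness, polarity, each in [0, 1])
    to a glass-transition temperature in kelvin."""
    crosslink, stiffness, polarity = x
    return 200.0 + 180.0 * crosslink + 120.0 * stiffness + 60.0 * polarity

def inverse_design(target_tg, steps=2000, seed=0):
    """Mutate a random descriptor toward the target Tg (hill climbing)."""
    rng = random.Random(seed)
    best = [rng.random() for _ in range(3)]
    for _ in range(steps):
        cand = [min(1.0, max(0.0, v + rng.gauss(0, 0.1))) for v in best]
        if abs(predict_tg(cand) - target_tg) < abs(predict_tg(best) - target_tg):
            best = cand
    return best

design = inverse_design(target_tg=323.0)
print(f"predicted Tg of proposed design: {predict_tg(design):.1f} K")
```

Generative models replace the blind mutation step with learned proposals, but the work-backward-from-the-property structure is the same.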
4. Multi-Scale Modeling
Machine learning bridges the gap between different simulation scales, from atomic to macro. AI-based interatomic potentials or surrogate models replace costly quantum simulations, allowing larger and longer-scale atomic simulations. These atomic-scale predictions can feed into higher-level continuum models for materials behavior. In practice, ML models learn effective rules that implicitly capture fine-scale physics and propagate that information upward. Thus AI supports consistent, multiscale workflows (e.g. feeding molecular dynamics results into meso-scale mechanics). The result is a more integrated understanding of materials: AI couples nano-level structures to bulk properties, enabling predictions of how atomistic changes influence large-scale performance.

Recent ML interatomic potentials can reproduce quantum accuracy across larger systems. Maruf et al. (2025) introduced a long-range equivariant potential (NequIP-LR) that explicitly models charge transfer. Their model significantly outperforms standard short-range ML potentials in predicting energies and forces on benchmarks. Because ML potentials evaluate forces orders-of-magnitude faster than DFT, they can simulate millions of atoms to capture nanoscale phenomena, and those results can inform continuum models. For example, ML-accelerated simulations of grain boundaries or phase interfaces can be used in finite-element models of mechanical behavior. These advances mean that detailed atomistic information (e.g. defect structures, local stress) can be propagated to predict materials behavior at engineering scales, thanks to AI models that operate across scales.
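A toy version of fitting an interatomic potential to reference data: the sketch below fits the two coefficients of a pair potential E(r) = a/r^12 + b/r^6 by linear least squares (2x2 normal equations). The "DFT" reference energies are actually sampled from a Lennard-Jones curve, and a real ML potential would use many-body descriptors, but the train-on-quantum-data, evaluate-cheaply workflow is the same.

```python
def fit_pair_potential(samples):
    """Least-squares fit of E(r) = a*r**-12 + b*r**-6 via 2x2 normal
    equations; a toy stand-in for training an ML interatomic potential."""
    s11 = s12 = s22 = t1 = t2 = 0.0
    for r, e in samples:
        x1, x2 = r ** -12, r ** -6
        s11 += x1 * x1
        s12 += x1 * x2
        s22 += x2 * x2
        t1 += x1 * e
        t2 += x2 * e
    det = s11 * s22 - s12 * s12
    return (t1 * s22 - t2 * s12) / det, (s11 * t2 - s12 * t1) / det

# Reference energies sampled from a Lennard-Jones curve (illustrative data).
REF = [(0.95, 1.961), (1.00, 0.000), (1.10, -0.983), (1.25, -0.774), (1.50, -0.320)]
a, b = fit_pair_potential(REF)
print(f"fitted potential: E(r) = {a:.2f}/r^12 + {b:.2f}/r^6")
```

Once fitted, the surrogate can be evaluated millions of times per second inside a molecular-dynamics loop, which is what makes the atomistic-to-continuum handoff practical.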
5. Enhanced Materials Databases
AI and NLP dramatically expand and enrich materials data repositories. Automated tools now scrape literature and patents to extract chemical compositions, structures, and properties, creating large, curated databases. For example, NLP pipelines can read papers and identify material formulas, synthesis conditions, and performance metrics, which are then entered into structured databases. AI can also represent these databases in latent space for similarity searches, linking related materials. These efforts fill gaps in existing databases and keep them updated with new findings. The result is a far more comprehensive data foundation: instead of manual curation, millions of data points can be added via AI. Such richer databases feed back into AI models, improving all downstream predictions.

Many automated pipelines have been demonstrated. Jiang et al. (2025) review NLP systems that identify material entities, compositions, properties, and processing steps from scientific text. These systems output structured records (e.g. element ratios, dopant levels, measured properties) ready for ML training. Hao et al. (2024) built a “zero-shot” AI agent (Eunomia) that automatically extracts MOF compositions, dopant content, and property data from literature. It performed on par with prior specialized extraction methods and generated ML-ready datasets with minimal human intervention. The authors even open-sourced the agent and data. Additionally, NLP can embed textual and numeric data into vectors: for instance, “dense embeddings” of material formulas and terms have been used to compute similarity between materials and assist discovery. Overall, AI-driven data harvesting significantly boosts the scale and connectivity of materials databases.
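A drastically simplified extraction pipeline can be sketched with a single regex that pulls a formula/value/unit triple out of a sentence. Production systems use trained named-entity and relation models rather than patterns like this; the example only shows the structured-record output such pipelines produce. The pattern assumes formulas with at least two element tokens.

```python
import re

# Formula = two or more element tokens; value = number; unit from a short list.
PATTERN = re.compile(
    r"(?P<formula>(?:[A-Z][a-z]?\d*){2,})\s.*?"
    r"(?P<value>\d+(?:\.\d+)?)\s*(?P<unit>eV|GPa|K)"
)

def extract(sentence):
    """Return (formula, value, unit) or None if no record is found."""
    m = PATTERN.search(sentence)
    return (m["formula"], float(m["value"]), m["unit"]) if m else None

print(extract("We measured LiFePO4 with a bandgap of 3.2 eV at room temperature."))
```

Each extracted triple would be appended to a structured database row, which is exactly the ML-ready format the systems above emit at scale.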
6. Uncovering Hidden Relationships
AI can reveal subtle patterns and analogies that humans might miss, linking disparate materials concepts. By learning high-dimensional correlations, AI models uncover non-obvious structure–property links or analogies across fields. For instance, generative models and knowledge graphs integrate data from multiple domains, identifying shared patterns between biology, art, and materials science. This helps scientists form new hypotheses: an AI might suggest that materials with specific motif patterns exhibit similar behaviors, even if they come from unrelated chemistry. In this way, AI-driven analysis exposes hidden “design rules” by connecting previously unlinked concepts, guiding researchers toward innovative ideas.

Graph-based AI has demonstrated this capability in practice. Buehler (2024) built a graph-generative model trained on interdisciplinary data. It identified unexpected analogies across art, biology, and materials. In one example, the AI mapped patterns from Kandinsky’s abstract painting into a new material design: it recommended a mycelium-based composite that balances chaos and order, with adjustable porosity and strength, inspired by the painting’s structure. As the author notes, the system “achieves a far higher degree of novelty” by revealing “hidden connections” that conventional methods cannot see. In another case, the AI found structural similarities between living tissues and music (Beethoven’s Ninth) at the level of organizational complexity. These demonstrations highlight how AI can bridge domains and suggest materials concepts that would not be obvious from chemistry alone, unlocking creative design routes.
7. Guiding Experimental Synthesis
AI not only predicts materials but actively guides how to make them in the lab. Closed-loop systems use ML to suggest synthesis routes, adjust processing parameters, and interpret interim results on the fly. For example, AI can recommend optimal temperatures, pressures, or precursor choices for a desired compound. When integrated with robotics or high-throughput equipment, these algorithms form “self-driving” labs that conduct experiments autonomously. The AI evaluates outcomes (success or failure) and plans the next experiment, continually steering the process toward success. This means experiments are executed more efficiently, with AI focusing efforts on experiments most likely to yield novel materials.

Autonomous laboratories have proven this concept. Szymanski et al. (2023) report an AI-driven solid-state lab (the A-Lab) that combines robotic synthesis with AI planning. They used large-scale DFT datasets to propose 58 target oxide and phosphate compounds. A natural-language AI model generated initial recipes from the literature, and active learning (Bayesian optimization) refined the conditions. Over 17 days, the system successfully synthesized 41 of the 58 targets. This ~70% success rate in making new compounds demonstrates that AI can effectively plan and optimize experiments. Importantly, the AI also analyzed failed attempts to suggest improvements to the synthesis protocols. Such results highlight that AI guidance, both in selecting targets and in tuning synthesis, can greatly accelerate experimental throughput and yield.
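The plan-synthesize-learn cycle can be sketched as a tiny active-learning loop: predict the yield of each untested condition from the nearest measured point, add an exploration bonus for unexplored regions, run the best guess, and repeat. `run_synthesis` is a mock experiment with a hidden optimum near 680 °C; a real system would drive robotic hardware, and a real planner would use Bayesian optimization rather than this greedy heuristic.

```python
import random

def run_synthesis(temp_c):
    """Mock experiment: yield peaks near an optimum (680 C) the loop
    does not know. A real system would drive a robotic furnace here."""
    return max(0.0, 1.0 - ((temp_c - 680.0) / 150.0) ** 2) + random.gauss(0, 0.01)

def active_learning(grid, rounds=12, seed=1):
    """Greedy explore/exploit loop over candidate temperatures."""
    random.seed(seed)
    tested = {}
    for _ in range(rounds):
        def score(t):
            if not tested:
                return 1.0
            nearest = min(tested, key=lambda u: abs(u - t))
            # predicted yield from nearest data point + exploration bonus
            return tested[nearest] + 0.002 * abs(nearest - t)
        t = max((t for t in grid if t not in tested), key=score)
        tested[t] = run_synthesis(t)
    return max(tested, key=tested.get)

best_temp = active_learning(grid=range(400, 1001, 25))
print(f"best synthesis temperature found: {best_temp} C")
```

Twelve simulated experiments suffice to home in on the optimum region, mirroring how A-Lab-style planners concentrate runs where success is likely.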
8. Surface and Interface Optimization
AI assists in finding optimal surface and interface structures for improved performance. For example, ML models can predict how different surface terminations or coating materials will affect corrosion resistance, catalytic activity, or electronic contact. By learning from atomistic simulations or experiments, AI suggests surface nanostructures or alloy compositions that maximize desired traits (e.g. maximizing adhesion or minimizing electron scattering). Additionally, ML-based interatomic potentials enable rapid simulation of surface energies and phase stability under various conditions. In essence, AI helps engineers and scientists tune nanoscale surface features – such as roughness or composition gradients – to achieve optimal interface behavior without exhaustive trial-and-error.

Recent reviews and studies highlight ML’s impact on surfaces. Noordhoek and Bartel (2024) note that learned interatomic potentials can predict surface phase diagrams and morphologies far faster than first-principles methods. As these ML potentials become more accurate (trained on large datasets), they make it feasible to model complex surface reconstructions and interfaces at realistic scales. For instance, ML models can capture how temperature and chemical environment change which surface facet is most stable, guiding material processing to achieve that surface. In catalysis, similar ML approaches allow screening of many surface compositions to find the ones with optimal activity and selectivity. By accelerating the prediction of surface properties, these AI tools help optimize coatings and interfaces for electronics, energy materials, and structural applications.
9. Quality Control and Defect Detection
AI enhances manufacturing quality by detecting defects and inconsistencies in materials. Computer-vision algorithms can inspect surfaces or cross-sections for cracks, inclusions, or voids much faster than humans. Likewise, ML models analyze sensor data (sound, vibration, etc.) to spot anomalies during production. In characterization equipment, AI can flag faults in real time. By learning from image and sensor datasets, AI systems classify defects with high accuracy and speed. This means that production lines can continuously monitor quality and automatically reject flawed parts, improving overall yield. Ultimately, AI-driven inspection and monitoring ensure materials meet specifications with less human oversight.

Many deep-learning methods have proven effective in defect detection. For example, in manufacturing studies, convolutional neural networks have been trained to identify tiny cracks and pits on metal surfaces from images with over 90% accuracy. In microscopy, Kalinin et al. (2023) show that ML is key for real-time analysis of scanning transmission electron microscopy data. They highlight the move toward “active ML,” where algorithms segment images and identify atomic-scale defects on the fly, and the AI then decides where to focus the next measurement. These capabilities enable closed-loop microscopy and production-quality inspection, automatically detecting imperfections that would be impractical to find manually. In summary, AI models act as smart sensors, continuously ensuring material quality at every stage.
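A stand-in for a vision-based inspector: flag pixels whose intensity deviates strongly from the image statistics. Real systems use trained CNNs on labeled defect images; this z-score filter only illustrates the flag-and-locate output they produce. The "micrograph" is synthetic.

```python
def detect_defects(image, threshold=3.0):
    """Flag pixels whose z-score against whole-image statistics exceeds
    the threshold; a toy stand-in for a trained CNN defect detector."""
    flat = [v for row in image for v in row]
    mean = sum(flat) / len(flat)
    std = (sum((v - mean) ** 2 for v in flat) / len(flat)) ** 0.5 or 1.0
    return [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row)
            if abs(v - mean) / std > threshold]

# 6x6 synthetic micrograph: uniform background, one dark crack pixel,
# one bright inclusion.
image = [[100] * 6 for _ in range(6)]
image[2][3] = 10    # crack
image[4][1] = 200   # inclusion
print(detect_defects(image))
```

On a production line the returned coordinates would trigger rejection or rework of the flawed part.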
10. Data-Driven Material Genomics
AI treats materials discovery like a genome project. Vast digital databases of compositions and properties serve as the “genome” of materials, and AI learns the underlying rules (or “genes”) that determine performance. Machine learning sifts through these databases (e.g. from the Materials Project, OQMD, etc.) to find composition–property trends or hidden dimensions. Embedding techniques map materials into a continuous space, allowing comparison and analogy-finding akin to bioinformatics. In this data-driven genomics view, AI can pinpoint the critical compositional features that predict a property, and even translate knowledge across materials families. The approach is essentially building a comprehensive data-powered model of materials behavior, analogous to how genomics decodes biology.

Large-scale projects and AI models exemplify this genomics approach. Merchant et al. (2023) used about 48,000 known crystal structures as a training “genome” to predict 2.2 million new candidates. More generally, modern “foundation model” perspectives emphasize the need for large, curated datasets. Pyzer-Knapp et al. (2025) note that materials science is moving toward such models, but also that existing databases like PubChem, ZINC, and ChEMBL (for molecules) have limitations in coverage and licensing. They stress that expanding and standardizing data is key. Meanwhile, NLP-derived embeddings of textual and numerical data have been applied: for instance, dense vector representations of materials (from literature data) have been used to compute material similarities and identify candidates in a “materials genome” space. Together, these efforts show how AI is assembling a data-driven map of materials, uncovering how compositional changes (“genetic” variations) influence properties.
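The "genome" analogy becomes concrete once materials are vectors: the sketch below encodes compositions over a fixed element vocabulary and compares them with cosine similarity. Real systems use learned embeddings from literature or structure data rather than raw stoichiometric fractions; the element list and compositions here are illustrative.

```python
import math

ELEMENTS = ["Li", "Fe", "P", "O", "Co", "Ni"]  # illustrative vocabulary

def vectorize(composition):
    """Map a {element: stoichiometry} dict onto the fixed element vocabulary."""
    return [composition.get(el, 0.0) for el in ELEMENTS]

def cosine(u, v):
    """Cosine similarity between two composition vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

lfp = vectorize({"Li": 1, "Fe": 1, "P": 1, "O": 4})  # LiFePO4
lco = vectorize({"Li": 1, "Co": 1, "O": 2})          # LiCoO2
lno = vectorize({"Li": 1, "Ni": 1, "O": 2})          # LiNiO2
print(f"LFP~LCO: {cosine(lfp, lco):.2f}  LCO~LNO: {cosine(lco, lno):.2f}")
```

Nearest-neighbour queries in such a vector space are what let "materials genome" tools surface analogues of a known compound across families.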
11. Developing Sustainable Materials
AI is accelerating discovery of greener materials by enabling virtual screening of novel polymer and composite chemistries. It predicts properties of candidate materials (e.g. strength, degradability) so researchers can focus on eco-friendly options. For example, deep learning models trained on bioplastic data can identify biodegradable plastic formulations that match conventional plastics in performance. AI also optimizes sustainable building materials (like low-cement concrete) by balancing strength, cost and carbon footprint. In polymer research, AI suggests recyclable or bio-based polymers with targeted properties, reducing trial-and-error synthesis. Overall, AI-driven workflows guide design of sustainable materials with fewer experiments.

Researchers have used multitask deep neural networks to predict the thermal and mechanical properties of polyhydroxyalkanoate (PHA) bioplastics, identifying PHA formulations that rival conventional plastics. A Nature Reviews Materials overview notes that AI-augmented design has helped discover polymers for energy storage, separations, and sustainable applications. In concrete science, an ML-plus-optimization study designed low-CO₂ mixes: the AI-designed mix achieved greater than 50 MPa strength with 25% less cement and 15% lower cost. Similarly, Cornell researchers applied graph neural networks to screen polyethylene compositions: ML-guided design of high-density polyethylene (HDPE) variants reduced the material required and improved recycled-HDPE quality. These AI-driven studies illustrate how predictive models and optimizers streamline the development of eco-friendly polymers, composites, and concretes.
12. Optimizing Additive Manufacturing Processes
AI enhances 3D printing by selecting optimal process parameters and detecting defects in real time. Machine learning models predict relationships between inputs (laser power, speed, layer thickness) and print quality, enabling automatic fine-tuning. For instance, AI can detect common printing errors and adjust settings mid-print to ensure first-time-right builds. Generative design algorithms guided by AI also propose optimized lattice and part geometries tailored to additive processes. In manufacturing workflows, AI automates routine tasks (slicing, supports) and analyzes data streams (thermal sensors, camera feeds) to improve consistency and speed. Overall, AI-driven process control and design lead to faster, more reliable additive fabrication.

A recent model used deep learning and synthetic X-ray data to spot hidden defects inside metal AM parts. Trained on simulated defect patterns, it correctly identified hundreds of unseen flaws in real 3D-printed components. In powder-bed metal printing, the AIDED framework (University of Toronto) uses ML and a genetic algorithm to choose laser parameters; it achieved R² ≈ 0.995 in predicting melt-pool size and produced parts with >99.9% density. Importantly, AIDED found optimal printing conditions in as little as one hour of computation. Industry surveys note that AI is being used for defect detection, parameter tuning, and even novel alloy design in AM. These results show AI’s role in tailoring process settings and spotting errors, significantly improving additive manufacturing throughput and quality.
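Parameter search of the AIDED kind can be caricatured with a small elitist genetic algorithm over (laser power, scan speed), scored by a surrogate density model. `predicted_density`, its peak at 250 W and 800 mm/s, and all bounds are invented for illustration; a real workflow would train the surrogate on melt-pool measurements.

```python
import random

def predicted_density(power_w, speed_mm_s):
    """Hypothetical ML surrogate: part density (%) vs. laser power and scan
    speed, peaking at 250 W and 800 mm/s (illustrative numbers only)."""
    return 100.0 - ((power_w - 250.0) / 50.0) ** 2 \
                 - ((speed_mm_s - 800.0) / 200.0) ** 2

def optimize(generations=40, pop_size=20, seed=3):
    """Elitist genetic algorithm over (power, speed) process settings."""
    rng = random.Random(seed)
    pop = [(rng.uniform(100, 400), rng.uniform(200, 1400)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: predicted_density(*p), reverse=True)
        parents = pop[: pop_size // 4]  # keep the fittest quarter
        children = []
        while len(parents) + len(children) < pop_size:
            p, s = rng.choice(parents)  # mutate a random parent
            children.append((min(400.0, max(100.0, p + rng.gauss(0, 10))),
                             min(1400.0, max(200.0, s + rng.gauss(0, 40)))))
        pop = parents + children
    return max(pop, key=lambda p: predicted_density(*p))

best_power, best_speed = optimize()
print(f"suggested setting: {best_power:.0f} W, {best_speed:.0f} mm/s")
```

Because every fitness evaluation hits the cheap surrogate instead of a physical build, the whole search completes in milliseconds, which is what makes surrogate-plus-GA loops attractive for AM process tuning.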
13. Stability and Longevity Predictions
AI models forecast material durability by learning from accelerated tests and simulations. By analyzing microstructural evolution or performance data, ML algorithms can predict when materials will degrade or fail. For example, AI can detect early signs of metal fatigue or corrosion before visible damage. It can forecast coating lifetimes under harsh environments or estimate polymer aging. These predictions let engineers select materials with longer lifespans or design preventive maintenance. In practice, AI uses time-series and imaging data to identify patterns of damage growth. This helps extend component service life in aerospace, automotive, and infrastructure by preemptively addressing weaknesses.

Lehigh University researchers developed a neural-network framework (PAGL) that learned to spot abnormal grain growth in metals under heat. On simulated polycrystalline data, it predicted which grains would grow excessively long before failure, correctly identifying impending abnormal grains in 86% of cases within the first 20% of the material’s life. This early-warning capability means materials can be engineered to avoid such failure modes. In another example, a physics-informed ML approach screened over a million high-entropy alloy compositions to find ones resistant to corrosion. Recent work has also applied ML to predict polymer mechanical behavior from X-ray diffraction: a model forecasted the strength and flexibility of new polymers with high accuracy, suggesting broad applicability for longevity prediction. Together, these studies demonstrate that AI can reliably predict degradation phenomena (grain coarsening, corrosion, mechanical failure) to improve material stability.
14. Catalyst Design for Energy Applications
AI expedites catalyst design by modeling complex surface chemistry. Machine learning can predict catalyst performance (activity, selectivity) from composition and structure, reducing reliance on serendipitous experimentation. For heterogeneous catalysts, explainable AI identifies which elements and structures drive activity, guiding rational design. In energy conversion, AI is used to discover efficient electrode and photocatalyst materials: for example, graph neural networks can screen thousands of alloy compositions for fuel cells or electrolyzers. These AI tools also optimize reaction conditions and predict stability under operating conditions. Ultimately, AI-assisted screening and modeling enable more efficient catalysts for hydrogen production, CO₂ conversion, fuel cells, and more.

A BIFOLD research team (Berlin) introduced an ML framework for data-poor heterogeneous catalysis. Their model handles small experimental datasets to robustly predict catalyst yields and uses explainable techniques to reveal which components most influence performance. In electrochemical catalysis, Korean researchers used a slab graph convolutional neural net (SGCNN) to predict adsorbate binding energies on candidate surfaces. In one study, the AI screened ~3,200 Cu–Au–Pt alloy compositions and identified a new ternary alloy catalyst. Experimental validation showed this alloy was cheaper and >2× as active as pure Pt catalysts. The ML screening took about one day, a task that would have taken years of DFT calculations. These examples highlight AI’s role in quickly narrowing vast chemical spaces for catalysts and finding high-performance energy-conversion materials.
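Composition screening of this sort can be sketched as enumerate-and-rank: walk a 10%-step ternary Cu–Au–Pt grid, estimate a binding energy with a mock linear-mixing model, and score each composition by its distance from a Sabatier-style optimum. The coefficients and the −2.9 eV optimum are made up; a real screen would use a trained graph network (such as the SGCNN mentioned above) plus DFT validation.

```python
def binding_energy(cu, au, pt):
    """Hypothetical linear-mixing model for adsorbate binding energy (eV)."""
    return -2.8 * cu - 2.2 * au - 3.4 * pt

def activity(cu, au, pt, optimum=-2.9):
    """Sabatier volcano: the closer to the optimal binding, the better."""
    return -abs(binding_energy(cu, au, pt) - optimum)

# All ternary fractions in 10% steps that sum to 1.
GRID = [(i / 10, j / 10, (10 - i - j) / 10)
        for i in range(11) for j in range(11 - i)]
best = max(GRID, key=lambda c: activity(*c))
print(f"top composition: Cu{best[0]:.1f} Au{best[1]:.1f} Pt{best[2]:.1f}")
```

The 66-point grid here is trivial; the same loop with a learned scorer over ~3,200 compositions is what turned a years-long DFT campaign into a one-day screen.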
15. Tailoring Electronic and Photonic Materials
AI-driven design is revolutionizing optoelectronic materials. Machine learning models predict electronic band structure, optical properties, and device performance from composition and structure, guiding the discovery of high-efficiency semiconductors and photonic crystals. In photovoltaics, AI has been used with automated experimentation to find new perovskite additives and hole-transport materials. For example, an ML-guided workflow narrowed millions of candidate molecules to just dozens of high-performing compounds, boosting solar cell efficiency by several points. In photonics, AI-assisted inverse-design algorithms enable creation of custom metamaterials and nanoscale optical components. Overall, AI accelerates tuning of bandgaps, refractive indices, and microstructures for LEDs, lasers, and sensors, enabling smarter design of electronic and photonic devices.

A high-throughput AI approach at Karlsruhe Institute of Technology combined machine learning with autonomous synthesis to rapidly find new molecular dopants for perovskite solar cells. From a library of ~1,000,000 candidates, the team tested only 150 selected molecules. This ML-guided selection of hole-transport dopants raised a solar cell’s power conversion efficiency from 24.2% to 26.2% (absolute). The “one-synthesis-at-a-time” robotic platform ran 150 experiments instead of the hundreds of thousands otherwise required, demonstrating AI’s power to cut discovery time in photovoltaics by orders of magnitude. Beyond photovoltaics, recent reports emphasize AI’s success in predicting polymer electronic and mechanical properties (e.g., forecasting polymer strength from XRD patterns) and in emerging areas like AI-designed metamaterials. Thus, AI methods are yielding practical advances in designing new semiconductors and optical materials for energy and communications.
16. Lightweight Alloys for Transportation
AI aids the design of high-strength, low-density metal alloys for vehicles and aircraft. By training on databases of alloy compositions and properties, ML models can propose new aluminum, magnesium, or titanium alloys with optimized weight-to-strength ratios. These models rapidly evaluate multicomponent alloy spaces that are intractable by experiments alone. For example, an AI framework for titanium alloys searched combinations of 18 elements to maximize strength; the optimized model achieved ~0.95 R² in property prediction. Such AI-guided discovery is applied to aerospace alloys (e.g. advanced Al and Ti alloys) to meet strict performance targets. In automotive and aeronautics, these lightweight alloys can reduce fuel use and emissions. AI also helps design alloy microstructures (via heat treatment or additive processes) for improved toughness and fatigue resistance.

A recent study presented a computer-aided ML workflow for high-performance titanium alloys. The researchers assembled a bespoke dataset of alloy compositions and properties, trained multiple regression models, and optimized an ML model that reached R²≈0.95 in predicting strength on test data. Using that model, they systematically searched the alloy space and suggested Ti-alloy chemistries expected to have superior strength-to-weight ratios. This demonstrates AI’s ability to expedite alloy design without exhaustive experiments. (No specific examples of auto steels were found in recent open literature.) These approaches allow designers to pre-screen candidate lightweight alloys before fabrication, shortening development cycles in transportation industries.
17. Combinatorial Exploration of Compositions
AI-driven experimentation accelerates “materials by design” by intelligently exploring huge composition spaces. Active-learning loops with robots allow automated synthesis of many variants guided by ML. Instead of exhaustively testing all combinations, AI selects the most promising regions of composition space. For instance, AI can analyze preliminary results and then direct a robot to make the next set of samples (a materials acceleration platform). This synergy is applied across materials – from alloys to chemicals. In practice, AI-guided labs can screen thousands of experiments (inks, alloys, molecules) in days. This dramatically speeds up mapping of phase diagrams and identification of optimal compositions for desired properties.

A team at PNNL and Argonne combined AI with high-throughput experimentation to optimize flow-battery electrolytes. Their active-learning system tested only a small fraction (less than 10%) of the possible solvent mixes yet found formulations that tripled the solubility of the redox-active compound. Meanwhile, in catalyst discovery, an AI-driven screen of 3,200 Cu–Au–Pt alloy compositions (for fuel cells) identified a new ternary catalyst with more than twice the activity of Pt at a fraction of the computational cost. In perovskite solar research, an ML-guided search of 1,000,000 molecules required only 150 syntheses to discover additives that boosted device efficiency. These case studies show AI successfully narrowing multi-dimensional spaces: whether electrolyte solvent mixtures, alloy chemistries, or polymer components, AI-directed experiments find winning compositions with a tiny fraction of trials.
18. Real-Time Data Analysis in Experiments
AI enables on-the-fly analysis of experimental data streams, creating closed-loop control in labs. During characterization (e.g. microscopy, spectroscopy, X-ray imaging), AI algorithms can instantly identify features of interest and feed back to the experiment. This allows dynamic adjustment of conditions without human intervention. For example, an autonomous system can monitor spectroscopic signals during thin-film growth and immediately change deposition parameters for better film quality. In manufacturing or testing, real-time AI can flag defects as they appear, enabling immediate corrective action. Overall, AI-integrated instrumentation turns experiments into adaptive processes, significantly speeding up discovery and improving reproducibility.

Oak Ridge National Laboratory demonstrated an autonomous thin-film deposition platform that used in-situ spectroscopy and AI to optimize pulsed laser deposition. The AI analyzed the deposited film’s quality in real time and adjusted the synthesis conditions (temperature, pressure, laser energy) for the next run. This closed-loop system optimized film properties roughly ten times faster than manual experimentation. (The study appeared in Small Methods.) Similar concepts are emerging at synchrotron beamlines and electron microscopes, where AI processes diffraction and imaging data in real time to guide scanning paths. While few formal studies have yet been published, the trend is clear: machine learning models are being integrated into measurement tools to perform instant data reduction and decision-making during materials experiments.
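On the instrument side, the simplest real-time trigger is a running-statistics monitor over the sensor stream: the sketch below uses Welford's online mean/variance algorithm and flags any reading outside 3σ, the kind of event a closed-loop controller could react to by adjusting deposition conditions. The signal and thresholds are synthetic.

```python
class StreamMonitor:
    """Running mean/variance (Welford's algorithm) over a sensor stream;
    readings beyond n_sigma standard deviations are flagged."""

    def __init__(self, warmup=20, n_sigma=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0
        self.warmup = warmup
        self.n_sigma = n_sigma

    def update(self, x):
        """Return True if x is anomalous relative to the stream so far."""
        if self.n >= self.warmup:
            std = (self.m2 / self.n) ** 0.5
            if std > 0 and abs(x - self.mean) > self.n_sigma * std:
                return True  # flag, and do not absorb the outlier
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return False

monitor = StreamMonitor()
# Synthetic deposition signal: 50 steady readings, then one spike.
signal = [10.0 + 0.1 * (i % 5) for i in range(50)] + [14.0]
flags = [monitor.update(v) for v in signal]
print(f"anomaly flagged at sample {flags.index(True)}")
```

Production systems replace the z-score with learned models over spectra or images, but the measure-flag-act loop is identical.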
19. Improved Structural Adhesives and Polymers
AI assists in formulating and optimizing advanced adhesives and polymer materials. By relating molecular structure or formulation parameters to macroscopic properties, ML models can predict bond strength, toughness, or elasticity of polymer blends. This reduces trial-and-error in developing high-performance structural adhesives. AI also helps design polymer networks (e.g. epoxies, polyurethanes) with tailored crosslinking for improved durability. For composite adhesives, ML can optimize the ratio of resins, hardeners, and fillers to meet target performance. In essence, data-driven models enable rapid identification of promising adhesive formulations and polymer composites for demanding applications (e.g. aerospace bonding, load-bearing plastics).

While specific recent reports on adhesives are scarce, analogous work exists for polymers. Tamura et al. (2024) developed an ML model that predicts a polymer’s mechanical behavior from its structure. Using X-ray diffraction data from polypropylene samples, their algorithm accurately forecasted the tensile strength and flexibility of new polymer formulations, demonstrating that AI can substitute for destructive testing in determining polymer performance. In principle, similar ML approaches can be applied to adhesive formulations to predict joint properties from resin chemistry and filler content. (No recent peer-reviewed studies explicitly on AI-designed adhesives were found.) Such polymer-property models lay the groundwork for smarter design of structural adhesives by extrapolating known data to untested mixtures.
20. Accelerated Battery and Energy Storage Material Development
AI is transforming battery R&D by rapidly identifying new electrode, electrolyte, and electrolyte-additive materials. Machine learning screens chemical databases and suggests novel compounds with high energy and stability. In electrolyte development, AI has identified formulations that boost solubility or conductivity under various conditions. For electrodes, AI models predict voltage, capacity, and lifetime from composition, thus narrowing down millions of candidates to a few promising materials. Autonomous labs (MAPs) use AI to integrate synthesis, testing, and feedback to accelerate optimization of battery formulations. Overall, AI-driven discovery shortens the lead time for advanced batteries (Li-ion, Na-ion, redox flow, etc.), enabling faster design of higher-energy, longer-lasting energy storage systems.

For example, a PNNL/Argonne team applied active learning to discover optimal redox-flow battery electrolytes. The AI-directed experiments led to electrolyte mixtures dissolving three times more active material than standard formulations. In electrode materials, deep learning has screened enormous datasets: Du et al. (2025) reported that by analyzing 1,000,000 candidate molecules from PubChem, their ML pipeline identified 1,524 promising organic compounds for battery electrodes. These and similar studies have guided experimental teams to synthesize only the top candidates instead of brute-forcing through all possibilities. Such AI-accelerated workflows are now routinely used in battery research to discover new cathode and anode chemistries, solid-state electrolytes, and optimized electrolyte additives.