Text Version for the FCIC Webinar: Developing Modeling Tools for the Emerging Biorefinery Industry

Ed Wolfrum, Feedstock-Conversion Interface Consortium Principal Investigator: Good morning everybody. Welcome to the second in a series of webinars featuring work from the Feedstock-Conversion Interface Consortium, the FCIC. My name is Ed Wolfrum; I currently serve as the principal investigator of the FCIC, and we’ll get started here in just about a minute; we can see a number of people still logging on. So please give us one more minute and we’ll get started.

I should mention we are recording this webinar, so it will be available on the FCIC website in a couple of days. To be clear, if you do have to drop off early, you can catch up where you left off in a couple of days on our website.

So I’m showing 2 hours, excuse me, 2 minutes after the hour, so why don’t we get started? Thanks to everybody for coming; I appreciate you taking an hour out of your day to join us. As I said, my name is Ed Wolfrum; I have the good fortune to serve as the principal investigator of the Feedstock-Conversion Interface Consortium, the FCIC. And I’m really glad you all took the time to join us today.

So, as many of you know, the FCIC is led by the Department of Energy Bioenergy Technologies Office, so BETO; it’s a collaborative effort among researchers from nine different national laboratories to address the unique challenges caused by feedstock and process variability on biorefineries.

In December I presented a webinar with a real high-level overview of the FCIC, talking about all the work that we’re doing, and it is available on our website; we’ll provide that link later in the talk. But for those of you who can’t wait, you can Google “FCIC BETO” and get right to our website.

So for those of you who did miss that talk, I’m going to spend only a minute talking about the FCIC and then we’ll turn it over to the main event today, which is a pair of presentations by two FCIC researchers: Dr. Yidong Xia of Idaho National Laboratory and Dr. Peter Ciesielski of the National Renewable Energy Laboratory. They’ll be presenting their work on computational and modeling approaches to address the challenges of feedstock variability at multiple scales. It’s really interesting work.

Because of the uncertainties associated with virtual presentations like this, we have made recordings of both talks, so we’ll actually be playing videos of those, with a little bit of a break between the two; each talk is about 20 minutes or so. But both Peter and Yidong will be available at the end to answer questions.

Next slide, please.

Sorry about this, [inaudible] but the FCIC is a nine-laboratory collaboration. Our key ideas, our focus points, are that biomass properties are both variable and different. They’re variable among different types of biomass materials and are affected by storage, transport, and handling, and they’re different from commodities that have been commercialized very successfully in the past. And empirical approaches to address these issues have been unsuccessful.

So we are developing first-principles-based knowledge and tools to both understand and to mitigate the effects of biomass feedstock and process variability on biorefineries. Our website has a lot more information on that. And I’ll direct you there. But let’s go to the main event, shall we?

Today’s talks: Dr. Yidong Xia of Idaho National Laboratory will be talking about the multiphysics models his team is developing, and Dr. Peter Ciesielski will be talking about his team’s multiscale modeling work. And so, without further ado, we will get started with the first presentation, and I’ll turn that over to our moderator, who is queueing up the first talk. So thanks everybody for your attention and let’s get started.

Yidong Xia, Idaho National Laboratory: Welcome to the FCIC webinar presentation on the modeling tools for the biorefinery industry. My name is Yidong Xia, a computational R&D scientist from Idaho National Laboratory. Currently I am the FCIC material handling task lead.

For today’s webinar I will present part one: multiphysics models for biomass preprocessing and material handling. The research I am going to talk about is coordinated with Dr. Allison Ray, who is a principal investigator for the FCIC at INL, and Dr. Vicki Thompson, who is the FCIC preprocessing task lead.

For part two of the webinar, Dr. Peter Ciesielski from NREL will talk about biomass models at the mesoscopic and atomistic scales. Material handling and preprocessing are two of the eight tasks in the FCIC, and Ed gave an overview of the tasks in his webinar back in December. I would like to use this list of names to acknowledge the many contributors and collaborators I have directly worked with in the material handling and preprocessing tasks: scientists, engineers, and students at the national labs and three universities. These two tasks also partner with BETO’s Consortium for Computational Physics and Chemistry, and FCIC modelers keep in communication with the CCPC across the national labs as well.

To start with, I’d like to briefly explain preprocessing and material handling. Material handling operations refer to feedstock feeding and transport—essentially from preprocessing to conversion in a biorefinery—and preprocessing operations refer mainly to feedstock size reduction, separation, and all other necessary means to ensure the feedstocks have the required material properties for conversion. No matter what we do, our ultimate goal is to lower the cost of preprocessing and handling and to make biomass fuels more competitive.

I always feel it’s necessary to show people the whole process demonstration unit in order to explain the possible issues associated with preprocessing and material handling. A PDU has a number of preprocessing units such as grinders and handling units such as conveyors and hoppers. Process upsets can occur at any time because of issues arising from feedstock material attributes and processing parameters.

Production-scale conversion of biomass in a biorefinery has remained limited. Milling and handling have been prone to process upsets such as jamming and clogging, resulting in increased downtime and high costs. A primary challenge in the design of a biorefinery is the storage, transport, and reactor feeding of the biomass feedstocks; increasing systemwide performance efficiency and enabling access to more biomass feedstocks would both benefit the techno-economic analysis and the life cycle assessment.

Many factors can cause process upsets. In this example, we use simulations to show that overfeeding the hopper can result in jamming. The purpose of this slide is to give you a concrete impression of how modeling tools will assist or even extend the quality-by-design workflows that the FCIC has adopted. Process modeling, at its full capacity, enables fast, large-scale testing and real-time performance diagnosis. Being able to use models to predict operational performance by testing over the many material attributes and process parameters, tests that would be costly to prepare and repeat experimentally, is the goal of our model development.

We would like to highlight that the effort to develop experimentally validated models is of particular significance, as the models must go through rigorous calibration and validation against the many material attributes and process parameters.

Granular flow models have been successfully applied to granular materials such as pharmaceutical and agricultural products, where the particles manifest relatively uniform material attributes such as particle shapes, size distributions, and material properties. Biomass particles, however, have very irregular shapes and variability in material properties. The major challenges in developing flow models for biomass include: how to formulate constitutive models to capture the complex behaviors, how to account for the large variability of material attributes in the models, and how to link the material attributes to the model parameters.

Our research first broke ground by identifying the lack of flow models suitable for biomass, the lack of experimental data to support model development, and the lack of open-source model platforms for broad user access.

In two articles published last year, we described the limitations of existing models as a knowledge base, recommended potential flow models and codes for biomass materials, and demonstrated early progress as proof of best practices. Our research outcomes will help stakeholders in the bioenergy industry and academia avoid investing in modeling techniques not suitable for biomass.

Our research goal is to develop experimentally validated models to achieve designs of flow equipment that are free of process upsets. We have developed a novel multiscale flow experiment to identify critical material attributes and critical process parameters and establish their correlation to equipment performance. Experimentally validated particle models and bulk flow models are developed to investigate operating conditions that are challenging to measure in experiments. Material handling does handshaking with preprocessing and conversion: the quality attributes from preprocessing are the material attributes and process parameters for material handling, and the quality attributes of material handling in turn become the material attributes and process parameters for conversion. Regular communication in the CCPC consortium also gives other modelers a chance to review and comment on our approach.

Operational reliability is the metric we use to evaluate our task. This refers to a design chart for consistent flow from the hopper at the design flow rate, with model validation defined as model fidelity of over 80% agreement with experiment. Here is an example diagram showing the critical material attributes, critical process parameters, and critical quality attributes for hopper flow design. We use our experimental and modeling methods to determine the criticality of material attributes and process parameters and establish their correlation to the discharge flow rate.

On this slide, I will go over a list of accomplishments and work in progress in the material handling task. They include discrete particle models for biomass flow first, and next the bulk flow models.

Today, spherical particles are still the most widely used DEM model in industry for process modeling. They are fast to compute and easy to scale on supercomputers. However, they cannot meaningfully represent irregular particle shapes. One way to improve is to use the so-called clumped-sphere model, which assembles spheres to represent complex particle shapes. More advanced shape models include the polyhedron, fractal stream, fractal shell, and other special shapes. Each shape model has its pros and cons, and we need to choose each for its best applications.
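To make the clumped-sphere idea concrete, here is a minimal Python sketch, an illustration only and not the INL production code, that approximates a rod-like biomass particle as a line of overlapping spheres. The sphere count trades shape fidelity against DEM cost, which is exactly the tension just described.

```python
import numpy as np

def clumped_sphere_rod(length, radius, n_spheres):
    """Approximate a rod-like particle as n_spheres overlapping spheres.

    Returns sphere centers along the rod axis and a common radius.
    More spheres capture the cylindrical shape better but cost more
    contact checks in DEM (cost grows with the total sphere count).
    """
    # Place sphere centers so the clump spans the full rod length.
    centers = np.linspace(radius, length - radius, n_spheres)
    positions = np.column_stack([centers,
                                 np.zeros(n_spheres),
                                 np.zeros(n_spheres)])
    return positions, radius

# Example: a 10 mm x 2 mm pine-like particle approximated with 5 spheres.
pos, r = clumped_sphere_rod(length=10e-3, radius=1e-3, n_spheres=5)
print(pos, r)
```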

In our early research we used the clumped-sphere models, including the multi-sphere and bonded-sphere models. They were a suitable starting point for modeling irregular biomass particles. We used these models for modeling pine particles in hopper flow and switchgrass particles in screw conveyor feeding, and generated two articles. However, these models become very expensive when representing complex shapes, because excessive numbers of spheres are needed to assemble every single particle, so we looked for other options while keeping these models in our pockets for future use.

Before we started to look for better options, we took a step back and studied the shape characterization of biomass particles, starting with milled pine particles. By using high-resolution X-ray CT, we have been able to make accurate digital representations of the pine particles and develop a strategy for how to coarse-grain their shapes for different research purposes.

Here is a close look at individual pine particles at 15-micrometer resolution, an extreme resolution. For DEM models, we require a coarse-grained particle representation. After coarse-graining we get a so-called high-resolution particle; each such particle consists of a few hundred surface triangles. At this level we can afford bulk simulations with a few thousand particles, which matches the number of particles in the laboratory-scale axial loader and the shear tester.

We developed an X-ray CT-informed complex-shaped particle modeling approach for the fractured pine particles. Using this approach, we performed an exhaustive study of the influence of pine particle material attributes such as particle shape, size, and stiffness on stress consolidation. We applied the calibrated model to a lab-scale friction test and found reasonable agreement with the experiment. The most important message is that particle morphology must be adequately represented to achieve good model fidelity.

More recently, we have developed a further coarse-grained particle model to lower the cost of biomass particle simulations. Now, by using only spherical particles, we compensate for the loss of complex particle shapes with a more sophisticated particle contact model. The new contact laws are available in our open-source packages.
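The actual contact laws live in the open-source packages mentioned above; purely for intuition, here is a hedged Python sketch of the general kind of ingredient such a model builds on: Hertzian normal repulsion plus a simplified cohesive term of the sort used in DEM codes (the INL formulation itself is more sophisticated and differs in detail).

```python
import numpy as np

def normal_contact_force(overlap, r_eff, e_eff, k_cohesion):
    """Hertzian normal repulsion plus a simplified cohesive term.

    overlap    : particle-particle overlap delta (m), > 0 when in contact
    r_eff      : effective radius 1/(1/r1 + 1/r2) (m)
    e_eff      : effective Young's modulus (Pa)
    k_cohesion : cohesion energy density (J/m^3); 0 recovers pure Hertz
    """
    if overlap <= 0.0:
        return 0.0
    # Hertz repulsion: F = (4/3) * E* * sqrt(R*) * delta^(3/2)
    f_hertz = (4.0 / 3.0) * e_eff * np.sqrt(r_eff) * overlap**1.5
    # Contact patch area from Hertz geometry: a^2 ~= R* * delta
    contact_area = np.pi * r_eff * overlap
    # The cohesive term pulls the particles together.
    return f_hertz - k_cohesion * contact_area
```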

DOE supercomputers also enable fast determination of particle model parameters with 100,000 simulations in parallel.
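That determination is embarrassingly parallel: each calibration run is independent, so many parameter combinations can be dispatched at once. A minimal sketch, with run_dem_calibration as a hypothetical stand-in for one DEM run scored against a calibration experiment:

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def run_dem_calibration(params):
    """Hypothetical wrapper: launch one DEM run with the given
    friction/cohesion parameters and return an error metric against
    the calibration experiment (placeholder arithmetic below)."""
    friction, cohesion = params
    return abs(friction - 0.5) + abs(cohesion - 1e5) / 1e6  # placeholder

if __name__ == "__main__":
    frictions = [0.1 * i for i in range(1, 11)]
    cohesions = [1e4 * i for i in range(1, 11)]
    grid = list(product(frictions, cohesions))
    # On a cluster this would be one job per parameter set; here we
    # simply fan out across local cores.
    with ProcessPoolExecutor() as pool:
        errors = list(pool.map(run_dem_calibration, grid))
    best = grid[errors.index(min(errors))]
    print("best-fit (friction, cohesion):", best)
```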

Particle-based biomass flow models have practical limitations. For industrial-scale problems, HPC can help, but not everyone has access to HPC. We had to develop a continuum mechanics-based model specifically to address the scale-up challenge. The trade-off is between detail and scale. We suggest particle models for understanding particle dynamics in equipment design, but a continuum model is a good choice because it costs much less computing time, and it is not a concern if you do not have access to HPC; most of our continuum simulations can be completed on a good workstation within a reasonable amount of time.

The development of continuum mechanics-based models for biomass flow is challenged from a different perspective. In the early days of our research, and even now, people use the Mohr-Coulomb constitutive model for the bulk rheology. The MC model has a few critical shortcomings: to name a few, it does not have density and pressure dependence, nor elasticity and dilation-induced softening. So it cannot capture the rheological behavior of biomass. The hypoplastic model that our finite element expert Dr. Wencheng Jin introduced for biomass flow overcomes most of the shortcomings of the MC model. I will expand on this work from here.
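For reference, the Mohr-Coulomb criterion ties shear strength to normal stress through just two constants, which is why it misses the density- and pressure-dependent behavior described above. A minimal sketch (textbook formula, illustrative parameter values):

```python
import numpy as np

def mohr_coulomb_shear_strength(sigma_n, cohesion, friction_angle_deg):
    """Classical Mohr-Coulomb yield: tau = c + sigma_n * tan(phi).

    Note what is *absent*: no dependence on bulk density or stress
    history, no elasticity, no dilation or softening, which are the
    shortcomings that motivate the hypoplastic model for biomass.
    """
    phi = np.radians(friction_angle_deg)
    return cohesion + sigma_n * np.tan(phi)

# Example: c = 2 kPa, phi = 35 degrees, normal stress 10 kPa.
print(mohr_coulomb_shear_strength(10e3, 2e3, 35.0))  # ~9.0 kPa
```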

Wencheng developed and validated advanced continuum flow theories and models to predict biomass behavior in storage and shear flow conditions. Judging from the agreement with the axial loading and shear experimental data, the model represents the state of the art for particulate biomass flow so far. This research is published in a recent journal article. The new model is open-sourced as user modules for Abaqus. In the meantime, the model is being implemented in a fully open-source package by our NREL task members.
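For orientation, hypoplastic models in the mechanics literature are generally written in a rate form in which the void ratio, and hence the density, enters the stress response directly; a common general form (the biomass-specific model adds further refinements) is

\[ \mathring{\mathbf{T}} = \mathsf{L}(\mathbf{T}, e) : \mathbf{D} + \mathbf{N}(\mathbf{T}, e)\,\lVert \mathbf{D} \rVert , \]

where \(\mathbf{T}\) is the Cauchy stress, \(\mathbf{D}\) the stretching (strain-rate) tensor, \(e\) the void ratio, and \(\mathsf{L}\), \(\mathbf{N}\) constitutive tensors. The norm term makes the response incrementally nonlinear, so loading and unloading differ without an explicit yield surface, one of the features the Mohr-Coulomb model lacks.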

More recently, Wencheng put the bulk flow model to the real test by predicting pilot-scale hopper flow and screw feeding scenarios. The flow models show reasonable agreement with the experiments. The capacity of the model to simulate 3D screw feeding using FEM is the first of its kind. This example shows our flow models are starting to facilitate the design of operational units. This research is published in a recent journal article, too.

In the meantime, Dr. Jonathan Steagal’s group at NREL has developed a CFD model for milled biomass flow in compression-screw feeders. This experiment-based model helps resolve the deformation behavior of the biomass material. The simulated torque of wood chips and corn stover being compressed and transported through the feeder agreed quantitatively with the experiment. The model is open-sourced for public use and, through CRADA projects between industry and the labs, enables simulations on HPC.

Dr. Jonathan Steagal’s group at NREL also developed a 1D convection and heat transfer model for predicting the temperature of biomass fed to pyrolysis reactors through auger feeders. Elevated temperatures can result in premature reaction and plugging of the feeder, as has been observed in NREL’s pyrolysis reactor. The 1D model provides a relatively low-fidelity but computationally efficient prediction of the biomass temperature in the feeding system for quick system diagnosis. The model is currently available to lab engineers and will be released for general access later.
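As a hedged illustration of what such a reduced-order model can look like (not the NREL code itself), here is a minimal 1D marching sketch in Python: steady advection of solids down a heated auger with a lumped wall heat-exchange term. The parameters h_eff, t_wall, and the rest are illustrative assumptions.

```python
import numpy as np

def auger_temperature_profile(n=100, length=1.0, velocity=0.01,
                              h_eff=0.5, t_wall=500.0, t_in=300.0):
    """Steady 1D advection with wall heat exchange:
        v * dT/dx = h_eff * (T_wall - T)
    integrated with a simple upwind march from inlet to outlet.

    velocity : axial solids velocity (m/s)
    h_eff    : lumped heat-transfer coefficient (1/s), an assumption
    """
    dx = length / n
    temp = np.full(n + 1, t_in)
    for i in range(n):
        # Explicit upwind step along the feeder axis.
        temp[i + 1] = temp[i] + dx / velocity * h_eff * (t_wall - temp[i])
    return temp

profile = auger_temperature_profile()
print("outlet temperature (K):", profile[-1])
```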

From this slide, I’ll go over a list of accomplishments and work in progress in the preprocessing task. They include preprocessing models from the pilot scale down to the microscale. To align with the interests of our research engineers, we have developed a knife-milling model that predicts feedstock deconstruction. This is an ongoing effort, with experimental data to be produced for the model calibration.

These models are developed to support the quality-by-design approach for the many operational units handling biomass and other feedstocks. They facilitate fast testing over quite a few design parameters, such as rotor speed, blade angle, and input mass rate, and their correlation to the output particle size distribution.

The size-reduced feedstock particles will need to be sieved, and air classification is one technique to accelerate this process. Again, to align with our research engineers’ needs, we also developed an air classification model using a coupled DEM-CFD method for modeling particle-airflow dynamics. Simulation results show reasonable agreement with the experiment. This modeling tool enables fast testing and real-time diagnosis of the processing parameters, such as blower speed and feeding mass rate.
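For intuition about the physics the coupled model resolves, air classification at its simplest is a drag-versus-weight balance: particles whose terminal settling velocity falls below the upward air speed are carried off. A rough Python sketch using the standard Schiller-Naumann drag correlation (spherical-particle assumption, illustrative parameters):

```python
import numpy as np

def terminal_velocity(d_p, rho_p, rho_air=1.2, mu_air=1.8e-5, g=9.81):
    """Terminal settling velocity of a sphere via fixed-point iteration
    on the Schiller-Naumann drag coefficient:
        Cd = 24/Re * (1 + 0.15 * Re**0.687),  valid for Re < ~1000.
    Real biomass particles are far from spherical, so this is only a
    first-order screening estimate."""
    v = 0.1  # initial guess (m/s)
    for _ in range(200):
        re = max(rho_air * v * d_p / mu_air, 1e-12)
        cd = 24.0 / re * (1.0 + 0.15 * re**0.687)
        v = np.sqrt(4.0 * g * d_p * (rho_p - rho_air) / (3.0 * cd * rho_air))
    return v

# A 0.5 mm pine particle (density ~500 kg/m^3) is carried upward if the
# classifier air speed exceeds its terminal velocity.
print(terminal_velocity(0.5e-3, 500.0))
```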

Fracture mechanisms for pine particles during milling are not clearly known and vary depending upon the orientation of the grain to the impact. Our research engineers performed single-particle tests to quantify the impact of material attributes on the forces and fracture mechanisms needed to break the particles. The data were utilized to guide the development of a bonded-sphere DEM model to predict milling performance. Biorefinery operators currently rely on rules of thumb and empirical guidelines, so knowledge gained in our research may help them examine process conditions to optimize their milling operations. In order to further investigate the fracture mechanisms of the pine particles, we used high-resolution 3D X-ray CT to reconstruct the microstructure of the pine particles. The digital structure provides the geometry data for developing a micromechanics DEM model. This intermediate-scale model will make it possible to upscale from the atomistic and mesoscopic models, for example Peter’s models.

In the meantime, we have developed a micro-porosity analysis method for pine. This shows the highly heterogeneous pore distribution in the pine particles, indicating the complexity of the fracture mechanisms in the material.

These computational models and tools together constitute a so-called virtual laboratory for preprocessing. They span the microscale, the laboratory scale, and eventually the pilot scale.

We are also currently exploring and testing novel computational models for emerging needs, for example, the next phase of FCIC preprocessing.

To summarize, the goal of our modeling work is to provide the biomass industry with predictive tools for the design of preprocessing and handling equipment. The tools include safe working envelopes of critical material attributes and critical process parameters, and open-source biomass modeling software.

We use multiple avenues to disseminate our research. For example, open-source software serves education, and in return the universities contribute to software development. Open-source software also demonstrates technology more easily to industry, and industry becomes more willing to collaborate.

That’s all I have to share with you today, and I thank you very much for your time participating in this webinar. Thank you again.

Ed: Thanks, Yidong. Just a couple seconds while we get Peter Ciesielski’s talk queued up. I did forget to mention if anybody has any comments or questions, please use the chat box. We’ll collect all those and we will have—we’ll pitch the question to Yidong and Peter after the second presentation.

Peter Ciesielski, National Renewable Energy Laboratory: Hello, I’m Peter Ciesielski from the National Renewable Energy Lab. My presentation today will focus on recent developments in the ongoing modeling and simulation projects within the FCIC and CCPC. The two main themes of my talk will be propagating important complexities throughout length scales in biomass, and building modeling tools to transform fundamental scientific knowledge into actionable information.

I think it’s important to begin this talk with a very quick overview of the hierarchical structure of biomass. I know many of you are quite familiar with this topic, so forgive me for the brief review, but for those who don’t know much about this, these are some really foundational concepts to the problems that we are trying to address.

The properties and behavior of biomass are controlled by a combination of structural features that manifest at many different length scales. For example, as shown in panel B, the bulk density and mechanical properties of biomass are really dictated by the tissue structure, which is composed of assemblies of cell walls and cell lumina.

So the function of those structures, shown in panel C, is not only to provide structural support and mechanical integrity but also to transport water and nutrients throughout the organism. The mechanical properties of those cell walls are essentially dictated by a very impressive biopolymer composite that we call lignocellulose. This material is essentially scaffolded by cellulose nanofibrils, shown in green, which are large agglomerates of cellulose chains that coalesce into nanostructures. These are crosslinked with hemicellulose and other polysaccharides, and then the material is backfilled with the aromatic polymer called lignin, which is essentially an amorphous polymer of phenylpropanoid subunits.

So it’s really capturing these emergent properties that presents a big challenge, not only to the modeling but to experimental efforts as well. Another major challenge to understanding and predicting the behavior of biomass is its inherent variability. So over the next few slides I’d like to present some examples that demonstrate how biomass can vary at different length scales, where that variability can originate from, and what type of functional impact it can have.

So on this slide we’re looking at some X-ray tomography reconstructions of biomass particles of different species, and you can see it’s really the tissue structure that dictates not only the bulk density but also the mechanical properties, the permeability, and the thermal properties of these materials. Variations in all of those properties will cause these particles to behave differently, for example in mechanical comminution or thermochemical processing, and these variations originate inherently from the different types of biomass species.

So this next example brings us down to the scale of individual cell walls, and here we’re looking at two TEM micrographs, and these are not of different species, in fact they’re from the very same plant, and in fact they’re only maybe about 50 microns apart from each other on the same TEM grid. But they are from very different tissue domains.

So shown on the left here are sclerenchyma cell walls from vascular tissue. They’re very thick and provide a lot of mechanical support. On the right are parenchyma cell walls, which are much thinner and aid in water retention and serve some other functions that are not necessarily structural. So not only are these different in their architecture, but they’re also very different compositionally. The sclerenchyma cell wall is heavily lignified; the parenchyma cell walls actually have very little lignin. And so if you were to subject these two different cell walls to cellulase enzymes, the enzymes would go to town on the parenchyma material and really digest it quickly, whereas they would struggle to make a dent in the sclerenchyma cell walls.

So here’s another really interesting example of variability. And again, we’re at the cell wall scale. Once again we’re in the same species; this is all Arabidopsis, but these three are genetic variants that were produced by our collaborator Clint Chapple from Purdue University, and these mutants have different lignin monomer compositions.

So the wild-type material has approximately 70% S lignin to 30% G lignin; if you need a reminder of what those different lignin monomeric subunits are, the chemical drawings are on the right. The middle sample here is 100% G lignin, and the sample on the right has 100% S lignin. So maybe there are some structural differences observable here, but nothing really too outstanding until we exposed these materials to an acidic pretreatment. We observed some mild deconstruction in the wild type; it was actually even milder in the G variant. But the material with pure S lignin literally came unglued: it fell apart at the middle lamella, and there’s this fibrillation; you can see the cellulose fibrils splayed on the edges of the cell walls. So it was heavily deconstructed by the same acid pretreatment that didn’t do a whole lot to the other two variants. And this resulted in a much higher biochemical reactivity of the pure S material after this pretreatment.

At the time, these observations were essentially unpredicted. They’re still difficult to interpret; in fact, folks are still arguing about this in the literature, and we took a stab at it in this manuscript, and we probably weren’t entirely right. But the point is that this really revealed some gaps in our holistic understanding of where and how variability can manifest in biomass and what the impacts are.

So these questions and knowledge gaps served as motivation for us and others to develop models that articulate scientific knowledge about biomass into a functional framework that can provide some sort of interpretation for results like these and also guidance for process development.

So next I would just like to cover a little bit of the evolution of models for the plant cell wall. This is an earlier one that I found in the literature, and there are some much earlier than this, but what I like about this is it’s a great example of a qualitative model. What this does is provide a conceptual framework that basically combines observations from experiments to describe not only the architecture of the wall at multiple scales—notice there are nanoscale features in the cellulose fibrils and molecular features that speak to the composition—but perhaps also how the different components are connected to one another.

So these models have certainly improved with time as analytical methods have become more accurate. This model has been one of the most famous plant cell wall models; it was published by Chris Somerville, and it includes a lot more molecular detail about the different types of polysaccharides that are present in the cell wall and how they might be arranged.

It also includes some information that was really derived from how cellulose is synthesized, what is known about that process, which provides insight into the structure of elementary cellulose fibrils.

Here’s another model, more recent, that was published by Dan Cosgrove. This model contains more information about the macromolecular structure of some of the hemicelluloses and the pectins and xyloglucans, for example. It also has some pretty specific and intentional architectural features here about the bundling of cellulose fibrils into these macrofibrils and how hemicellulose can act as an adhesive between them in certain cases.

This brings us to an even more recent model that was published by Kang et al. in Nature Communications, a journal with a lot of high-impact publications, so there are a lot of folks interested in understanding the structure of plant cell walls.

So this model has some pretty important features, one of which is that lignin is rarely in direct contact with the surface of cellulose; these observations were derived from NMR. It also speaks more to the specific connectivity of lignin and hemicellulose, and it speaks to the macromolecular structure of xylan that’s close to cellulose versus far away from cellulose. So there’s a lot of new information in this model.

And so while all the models I’ve shown you are insightful frameworks for understanding how biomass is configured, these are still mostly, at the end of the day, PowerPoint drawings. There’s no quantitative information, or very little. There are no molecular binding energies, no mechanical properties, no diffusivities, right? If you’re trying to optimize a process, or design a metamaterial, or design a reactor, the languages of design and optimization are articulated in numbers, and those are really absent from these types of qualitative conceptual models.

So that was a lot of background information and motivation, but it does bring us to the objective of these modeling efforts within the FCIC that I want to talk about: to develop a computational framework that captures state-of-the-art scientific knowledge about plant cell walls, and lignocellulose in general, in a functional computational model that we can ask questions of and receive answers from. Our approach is to combine multiscale modeling spanning quantum mechanics and molecular dynamics. We have many opportunities for experimental validation, looking at compositional changes as a result of different processes and measuring mechanical properties, in collaboration with other experimental groups across the FCIC. The hope is that we can articulate the multiscale architecture of biomass and the different processes it may undergo, like chemical gradients, temperature gradients, mechanical loading, and so forth. And we can model a lot of different types of biomass: we can change the architecture and the composition to approximate different feedstocks that are relevant to our industrial stakeholders, or we can even predict what an ideal, genetically modified biomass might look like for a given process.

So now I’d like to provide some specifics of how we’re approaching this; the nice high-level overview is over and now we’ll jump into the weeds a little bit. Before we can begin to model the entire lignocellulose assembly, we need to understand the components. So what we’re going to start with is a model that can capture the mechanical behavior of cellulose. We’re after mechanical processing applications here, right, because this task is focused on preprocessing: mechanical comminution and handling and so forth.

So the question we’re asking here with quantum mechanics is how far you can stretch the glycosidic bond in cellulose before it breaks. We did this simulation using quantum mechanics, and essentially what we’re looking for is where the quantum potential deviates from the classical potential, which is shown here on the top right. That’s the point at which there’s just not enough electron density in the glycosidic bond to constitute a functional covalent bond anymore. And that happens when the C1-O4 bond length reaches about 1.5 angstroms. So this is a nice quantitative value that’s maybe not apparently meaningful right now, but it’s something that we can build on.

Now that we have that quantum mechanical information, we can incorporate it into a much larger molecular dynamics simulation, and what I’m going to show on this slide is a simulation of tensile failure of an 18-chain cellulose nanofibril. What we’re doing is pulling on each end and then testing the bond lengths. Each time a bond length exceeds that critical value, the bond breaks, and that’s visualized by one of these glowing orbs. We’ll see that as we continue to pull on this fibril it continues to break bonds, and then eventually the entire thing snaps. The stress at this point is called the ultimate tensile strength of this cellulose fibril.

I’m going to let this movie play one more time because there are a couple of other interesting features I want to point out. One is that as bonds begin to break in roughly the same location, some oligomers are generated. And then as additional bonds break, they even generate some monomers and dimers.

I think that’s interesting because it suggests that just this mechanical treatment of this cellulose fibril generates some sugar fragments that are readily fermentable before enzymatic hydrolysis even takes place.
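A hedged sketch of the kind of post-processing such a simulation implies: given per-frame coordinates of each glycosidic C1-O4 pair, flag the first frame at which a bond exceeds the quantum-derived critical length. The array layout here is illustrative, not the actual NREL pipeline.

```python
import numpy as np

CRITICAL_LENGTH = 1.5  # angstroms, from the QM stretching calculation

def detect_bond_breaks(c1_xyz, o4_xyz):
    """c1_xyz, o4_xyz: arrays of shape (n_frames, n_bonds, 3) in angstroms.
    For each glycosidic bond, return the first frame index at which its
    C1-O4 distance exceeds the critical length (-1 if it never breaks)."""
    dist = np.linalg.norm(c1_xyz - o4_xyz, axis=-1)   # (n_frames, n_bonds)
    broken = dist > CRITICAL_LENGTH
    return np.where(broken.any(axis=0), broken.argmax(axis=0), -1)

# Toy example: 3 frames, 2 bonds; the second bond breaks at frame 2.
c1 = np.zeros((3, 2, 3))
o4 = np.zeros((3, 2, 3))
o4[:, 0, 0] = 1.43                 # bond 0 stays near equilibrium
o4[:, 1, 0] = [1.43, 1.48, 1.62]   # bond 1 stretches past 1.5 A
print(detect_bond_breaks(c1, o4))  # -> [-1  2]
```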

So those molecular dynamics movies are fun and pretty, and I hope you all enjoyed watching them, but really what we’re after here is quantitative analysis. So what we did essentially is perform a tensile test in silico. A materials scientist will recognize the stress-strain curve; I pulled this one right off of Wikipedia, and this is the type of behavior we would expect from an experiment. If we plot the stress versus strain that we calculated from the simulation I showed on the previous slide, we see very similar behavior, which is great. I want to caveat these results—they’re brand new; we just did this last week—so I don’t have supreme confidence until I’ve done this a few more times under a few different conditions.

But at least the behavior looks realistic. I can’t provide, with confidence, a quantitative answer at this point to the question I put up there, but I can say these results indicate that it takes much less force to break an 18-chain cellulose nanofibril than bulk cellulose, and those bulk values have been measured experimentally. At this point we attribute that observation to the fact that there are so many surface chains in an 18-chain cellulose fibril, and they just don’t have the benefit of the stabilizing hydrogen bonding that all the chains have in a bulk cellulose assembly.
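In the same hedged spirit, the quantitative readout of such an in-silico tensile test is simple to extract once the stress-strain series is in hand: stiffness from the initial slope and ultimate tensile strength from the peak stress. An illustrative post-processing sketch, not the published analysis:

```python
import numpy as np

def tensile_summary(strain, stress, elastic_window=0.01):
    """Return (Young's modulus, ultimate tensile strength, strain at UTS).

    strain : engineering strain (dimensionless), increasing
    stress : stress (Pa), same length as strain
    The modulus is a least-squares slope over the small-strain window,
    where the response is expected to be linear.
    """
    strain = np.asarray(strain)
    stress = np.asarray(stress)
    mask = strain <= elastic_window
    modulus = np.polyfit(strain[mask], stress[mask], 1)[0]
    i_uts = int(np.argmax(stress))
    return modulus, stress[i_uts], strain[i_uts]
```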

So now we’ve got a reasonably good handle on the mechanical behavior of cellulose. As we move toward the macromolecular assembly, we need to start looking at how the polymers interact with one another. This is a study where we’re looking at interactions between cellulose and xylan, and actually we’ve done a lot in this space; I don’t have enough time to talk about it all today, but here the variability that we’re looking at is the degree of polymerization of the different xylan chains.

So we can ask the question: How does the molecular structure of hemicellulose—in this case the molecular feature we’re interested in is its degree of polymerization—impact its binding propensity to cellulose? And what we found is pretty straightforward. We found that longer chains tend to bind much more strongly to cellulose, whereas shorter chains are actually pretty soluble in an aqueous environment; they tend to come off. Quantitatively, what we can say is that xylan with a degree of polymerization less than 32 will readily come off and become soluble in an aqueous environment, and chains that are longer tend to bind pretty much irreversibly.

And so now we have some actionable insights. If you want to completely separate xylan from cellulose effectively, you really need to reduce the degree of polymerization of the xylan, either enzymatically or by some chemical treatment, to below about 32 if you want to get it off the surface of cellulose. And there are processes that are looking to do this type of fractionation.
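Encoded as a rule of thumb, the DP-32 crossover gives a quick screening estimate of how much xylan a given depolymerization step could release from cellulose surfaces. A deliberately simple sketch of the reported finding (the threshold is the simulation result above; the example DP values are made up):

```python
def extractable_xylan_fraction(dp_values, dp_threshold=32):
    """Fraction of xylan chains expected to desorb into aqueous
    solution, using the simulation-derived rule that chains with a
    degree of polymerization below ~32 come off cellulose readily
    while longer chains bind essentially irreversibly."""
    short = sum(1 for dp in dp_values if dp < dp_threshold)
    return short / len(dp_values)

# Example: a depolymerization step shifts the DP distribution down.
before = [120, 90, 60, 45, 33, 30]
after = [31, 28, 25, 20, 35, 15]
print(extractable_xylan_fraction(before), extractable_xylan_fraction(after))
```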

So now it’s time to incorporate lignin into our assembly. The nanostructural template we’re using is based on the NMR findings from the paper I mentioned earlier, where we have a core of cellulose elementary fibrils; those are enclosed in hemicellulose, and then lignin is on the outside, rarely interacting directly with cellulose, as per the NMR observations.

A few other notable features come from the literature: the elementary fibril has 18 chains, which came from the latest and greatest study of how cellulose is synthesized and the enzyme assembly that’s responsible for that, which was a fascinating finding on its own. Then once we have this assembly built, we hydrate it to a realistic state and then add ions so that the assembly is charge-neutral.

So now we’ve built this beautiful assembly of lignocellulose polymers in all of its molecular detail and scientific glory, and we proceed to rip it apart in silico, because, at least for this application, we’re studying biomass deconstruction; we’re interested in how much force it takes. There are so many questions we can ask of this type of simulation. I think one of the most interesting, from a fundamental level, is: what is the least amount of force or energy that it takes to mechanically deconstruct biomass? If we understand that number at a fundamental level, it may provide insight as to how close our current mechanical comminution processes are and how much room there is for improvement.

Ed: I think we might have a technical issue.

Peter: One of the important features of this modeling framework, designed in from the beginning, is that we want to retain the flexibility to change the molecular composition pretty easily. And so in this study, we varied the monomer composition of the lignin polymers. What we found was pretty interesting: as you increase the number of S subunits in the lignin polymer, it becomes much more difficult to deconstruct this material, taking more force to pull it apart.

And so I think that’s a pretty interesting fundamental finding. This tool could really help with design from a molecular perspective: if you have genetic modification tools, which we do have now, you could really tune the mechanical properties of the lignocellulose assembly for various materials applications, or you could tune it to make it more amenable to mechanical deconstruction.

So really quickly I just want to show some results from some compression stress simulations using the same system. These simulations are ongoing, so I don't want to say too much more about them, other than that we are expanding our capabilities to look at different modes of mechanical deformation.

So we’ve made a lot of progress with molecular simulations of lignocellulose, but this slide is just to highlight how important it is that we scale this up. Coarse-graining is critical if we want to get anywhere close to the length scales that Yidong was talking about in his presentation. This whole assembly is roughly 20 nanometers, and when you look at that in the context of a plant cell wall, it’s barely one pixel in this TEM image.

So our coarse-graining strategy here is essentially to divide our biopolymers into assemblies of atoms, perform classical molecular dynamics simulations, and look at the interaction energies between those assemblies. Our different coarse-grained units are shown here. Then what we do is perform those simulations, and the resulting interaction energies are plotted as the black dots. Then we fit our coarse-grained potential, which is the red line going through each of those plots.
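A hedged sketch of that fitting step: given sampled pair distances and interaction energies from the atomistic runs, fit a simple analytical nonbonded form. Lennard-Jones is used here purely for illustration (the functional form in the actual CHARMM-based model may differ), and the data points are synthetic stand-ins:

```python
import numpy as np
from scipy.optimize import curve_fit

def lennard_jones(r, epsilon, sigma):
    """12-6 Lennard-Jones pair potential."""
    return 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

# The (distance, energy) samples would come from atomistic simulations;
# these numbers are synthetic stand-ins for illustration only.
r_data = np.linspace(0.4, 1.2, 20)            # nm
e_data = lennard_jones(r_data, 1.0, 0.45)     # pretend "measured" energies
e_data += np.random.default_rng(0).normal(0, 0.02, r_data.size)

params, _ = curve_fit(lennard_jones, r_data, e_data, p0=[1.0, 0.4])
print("fitted epsilon, sigma:", params)
```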

And so we essentially have to do every type of unit with itself and with every other type of unit, so there are a whole lot of combinations, and we needed to do a lot of simulations. I don’t have time to go into detail here, but one finding I do think is very robust that came out of this is that the S lignin subunit, the subunit with two methoxy groups, interacts more strongly with every other type of biopolymer unit. And so what this tells you is that S lignin is really sticky. I think this has more implications for fractionation and material design than probably any mechanical type of processing.

I do want to mention it’s also critical that we accurately represent the stiffness of cellulose fibrils, because they really provide the scaffolding for lignocellulose. In order to do that, we once again compared atomistic simulations to coarse-grained simulations. The coarse-grained simulations are very fast, so we can just do parameter sweeps over the angle term that dictates that stiffness, and we’re able to identify the right parameter value that allows the coarse-grained simulation to replicate the behavior of the atomistic one.
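The sweep itself reduces to a one-dimensional matching problem: pick the coarse-grained angle constant whose predicted fibril stiffness best reproduces the atomistic reference. A sketch with hypothetical placeholder numbers:

```python
# Hypothetical sweep results: coarse-grained angle force constant
# (arbitrary units) -> fibril stiffness observed in the CG run.
# All values are placeholders; the real ones come from the simulations.
sweep = {50: 0.6, 100: 0.9, 200: 1.4, 400: 2.1}
atomistic_reference = 1.3  # stiffness from the atomistic simulation

# Select the parameter minimizing deviation from the atomistic target.
best_k = min(sweep, key=lambda k: abs(sweep[k] - atomistic_reference))
print("selected angle constant:", best_k)  # -> 200
```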

So now finally, after a lot of hard work, we have arrived at this new coarse-grained model for lignocellulose. We did a very nice, accurate coarse-graining that will allow us to perform simulations hundreds, if not potentially thousands, of times larger than what we can do with atomistic simulations. It’s procedurally generated, meaning that it places density for cellulose, hemicellulose, lignin, and even void space based on rules that we derived from the experimental studies that I’ve mentioned. It retains the ability to vary the biopolymer composition, and it’s built in CHARMM, and we are going to be ecstatic to release this thing as open source once our publication is out. And now we can do much larger simulations looking at compression stress.

So the coarse-grained system, although it’s still in development, has already produced pretty interesting results, such as these kink defects that our simulations predict will form in cellulose fibrils.

In a recent study we found that the preferential sites for enzymatic hydrolysis are actually macromolecular defect sites that are preferentially attacked by cellulase enzymes. In studies that we’ve done in the past, imaging the surfaces of materials that have been subjected to different degrees of mechanical processing, the ones that have this really kinky-looking surface are incredibly digestible. At the time we thought it was just a surface area thing, a porosity thing. And now we know it’s actually these macromolecular defects that are changing the biochemical reactivity of the material before it even sees enzymes or thermochemical treatment.

So I want to describe where the handoff is going to be between what I’ve been presenting today and the work that Yidong presented earlier. And we’re almost there: our coarse-grained model incorporates information all the way from the atomistic and molecular level up to these large nanoscale assemblies, and those will essentially define how the assemblies interact in the lattice models that Yidong is developing, which are built from XCT data. Once we connect the dots here, we will have a multiscale modeling framework for lignocellulose that incorporates all of these different features, starting from the molecular level: individual biopolymer bonds, nanoscale assemblies, tissue-scale structures, and bulk mechanical properties.

So in summary, I hope I’ve convinced you that we’ve made some good progress toward translating fundamental scientific knowledge about the plant cell wall into actionable information, and we’re addressing important capability gaps such as the lack of integration between scales and the ability of models to address inherent variability. I just want to remind you this is a huge problem; it is a work in progress. But we seem to be generating valuable insight every step of the way.

And if you look over here on the right at these various models of lignocellulose that have evolved over the last few decades, I hope it’s evident that the FCIC and CCPC modeling efforts are advancing the state of the art and really moving the needle in this space.

Of course I need to acknowledge the people who are doing the work. I work with a team of absolutely brilliant people. They are the best in the business as far as I’m concerned, and of course we’re very grateful for support from BETO through the FCIC and CCPC, and I’d be happy to take any questions.

Ed: Super. Rachel, can we move to the next slide? Or maybe I’ve got that. I’m sorry, I have that.

So thanks everybody for your attention. I want to say a couple quick things and then maybe we’ll move on to questions. That was a lot of information with a lot of technical details that we’ve shared today. We’ll certainly—looking at the chat I don’t see any questions, so if people do want to either break in or put a question in the chat, we’ll answer them now. But there are other ways to get some more information.

First of all, check out our website, shown here. It’s kind of a long title, but again, Google “FCIC BETO” and we’re the first hit on Google. On our website you’ll see details of all this work, including a year-in-review report for 2020 that summarizes a lot of the work, not only in this task but all across the FCIC. You can download the PDF and read it at your leisure. A list of all of our publications is also available on that website. Or you can drop us an email and we will connect you with the people who are best able to answer your questions.

So I do see one question coming in, and we’ve got a couple of minutes. Yidong, we have a question for you: What software are you using to implement your models? And then a specific question on how the FEM model was applied for the hopper flow simulation: is the biomass treated as a continuous fluid or is it treated as particles? So: the software question, and the modeling question on slide 21.

Yidong: Yes, I see a lot of questions here. So first let me reply to the first question. We have been developing our custom DEM model based on the open-source LIGGGHTS DEM package, and we have now released it as LIGGGHTS-INL, available on GitHub under the group account “idaholab”. So under “idaholab” you will find a repository called LIGGGHTS-INL. It has just been released as open source, and we are gradually adding more test cases, including example test cases for users to get familiar with the concepts.

So the next one is slide 21. The FEM model for hopper flow of bulk-scale biomass has been implemented in an ALE (Arbitrary Lagrangian-Eulerian) framework in the Abaqus software (so the biomass is treated there as a continuum, not as discrete particles), and currently the same methodology is being implemented in OpenFOAM, an open-source framework, by Jonathan Steagal’s group at NREL. So we have two versions of the same model working for our problems.

And the third question is [inaudible]. Yes, that’s a very good question. If we use a lot of refinement for the polyhedral particles, we might only be able to simulate a few hundred particles, but if we use the coarse-grained model, we might go to tens of thousands. And of course, for DEM simulation another factor is the allowable time step size. If we are using high stiffness for those particles, the time step can be as low as 0.1 microseconds, and at that level you cannot achieve very long simulated times, even with powerful computing. That’s the short answer to those questions, and if you are interested in discussing more, please feel free to reach out to us.
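For context on that time-step remark: DEM codes commonly bound the stable step by the Rayleigh time, which shrinks with particle size and with increasing stiffness. A minimal sketch of the standard estimate from the DEM literature (parameter values are illustrative):

```python
import numpy as np

def rayleigh_time_step(radius, density, shear_modulus, poisson):
    """Rayleigh critical time step for DEM:
        dt_R = pi * R * sqrt(rho / G) / (0.1631 * nu + 0.8766)
    Simulations typically use some fraction (e.g., 10-20%) of dt_R."""
    return (np.pi * radius * np.sqrt(density / shear_modulus)
            / (0.1631 * poisson + 0.8766))

# Example: 0.1 mm particle, 500 kg/m^3, stiff contact (G = 1 GPa):
dt = rayleigh_time_step(1e-4, 500.0, 1e9, 0.3)
print(f"{dt:.2e} s")  # ~2.4e-07 s; a 10-20% safety fraction lands
                      # near 0.1 microseconds or below
```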

Ed: Thanks, Yidong.

Another question on the resolution of the XCT particle models: what is viable if you factor in HPC time?

Yidong: Is that a question for me or Peter?

Peter: I think—are you asking about taking the XCT-inspired models to downstream conversions? Yeah, so we’ve actually done a lot in that space; I didn’t talk about it much here because I was more focused on mechanical treatments. But we published some pyrolysis modeling, and those models don’t exactly use XCT, but they do have resolved microstructure, again from procedurally generated models. We’ve done thermochemical conversion. Our most recent publication in that space was actually on lignin extraction, the supercritical methanol lignin extraction, and it was a reaction-diffusion model. So yeah, the answer is yes, absolutely we are leveraging those for conversion process models as well.

Ed: I guess we’re just about up on the hour, but are there any other questions? I think we’ve answered everything, but we will reach out, and for anyone who put something in the chat, we’ll make sure we’ve answered it to your satisfaction, and we’ll include email contacts for both of the presenters. You can count on us to do that.

Are there any other questions? Anybody want to shout something out? I think we’ve got one more minute. If there are more questions, I’m sure our folks will stay on, I would hope.

Okay, we’re going once, and twice, and okay, great. Well—thank you all very much for your attendance. I know everybody is busy—

I’d like to thank—I’ll turn off my video, I guess—Peter and Yidong for the presentations. Again, they will be on our website within a couple of days. Thanks for your time; please reach out with any additional questions or comments. Hope you all have a safe and productive day and rest of the week. Thanks everybody.

[End of Audio]