Developing First-Principles Tools and Knowledge for the Emerging Biorefinery Industry: Text Version

Ed Wolfrum, Feedstock-Conversion Interface Consortium Principal Investigator: So, welcome, everybody. Thanks for spending some time with us today. This will be the first in a series of webinars on the FCIC, or the Feedstock-Conversion Interface Consortium. The goal of the webinars will be to share the work of the FCIC with our stakeholders and to get some feedback on how we can improve. So the FCIC is led by the Department of Energy’s Bioenergy Technologies Office, BETO. We are a collaborative effort among researchers from nine different national labs trying to address the challenges that feedstock and process variability pose for biorefineries. As I said earlier, my name is Ed Wolfrum, and I serve as the principal investigator of the FCIC.

Today I’m going to provide an overview of the FCIC. We’ll talk a little bit about the motivation for FCIC’s formation and then discuss our overall research approach. After that, I’ll spend some time sharing highlights from the researchers who are part of the FCIC. This’ll actually be the major part of the overall webinar. Finally, we’ll have some time at the end to answer questions. As I said, you’ll have a chance to use the chat box to put questions in, and my colleague Amy Sluiter, who serves as the FCIC project manager, will be monitoring that chat box. As I said, at the end of the presentation we’ll have time to answer some of those questions, but we’re also going to ask you some questions. We’d like to get some feedback from our stakeholders. So with all that done, let’s get started here.

So what is the FCIC, in one slide? The Feedstock-Conversion Interface Consortium is led by DOE as a collaborative effort among researchers from nine different national laboratories. We’re focused on two key ideas. First: biomass feedstock properties are variable, and they are different. They’re variable because they change over time and space. The obvious example here is the bale of corn stover shown on the right-hand side. The properties of that bale will depend on how it’s harvested, how it’s stored, how it’s transported, and how it’s processed. And biomass properties are different—different from the agricultural commodities that we’re all familiar with and have had a hundred years to work on, corn grain being the obvious example.

We argue that empirical approaches to address these two issues haven’t been entirely successful in the past, so the FCIC’s goal is to develop first-principles-based knowledge and tools to understand and then to mitigate the effects of biomass feedstock and process variability on biorefineries. One of the key drivers for creating the FCIC has been the problems that advanced biorefineries have experienced starting up. A workshop report is shown on this screen; BETO organized this workshop several years ago and invited stakeholders from the national laboratory system, from academia, and from industry to talk about those issues. One of the activities during that two-day workshop was a survey to look at key operational challenges in these biorefineries.

And in the data shown in this little bar chart, three of the top five problems were directly related to feedstock variability. And if you argue that the lack of experience working with this feedstock is part of that variability, the top five problems all had to do with variability. So when the FCIC was created, we chose one organizing principle, and I want to spend a little time talking about quality by design, or QbD. Many of you with a background in pharmaceuticals will be very familiar with this. It’s a well-accepted approach in pharmaceutical development, endorsed by the FDA, and it operates under a basic chemical engineering principle (and I should point out I am a chemical engineer by training): any chemical process is a series of specific unit operations, specific process steps, like a reactor, like a filtration step.

And these specific unit operations are arranged in a specific order to take raw materials in and make products out. And each of these unit operations, as shown in the graphic here—we have input materials, we have output materials, and that unit operation gets operated in a certain manner. Now, the attributes of the input materials are called critical material attributes. These are the things that we need to measure about the material coming in. The way that unit operation is run, how it’s operated, defines the critical process parameters. Think of the temperature in a chemical reactor. Think of the upstream pressure on a filtration system. And collectively, the critical material attributes and the critical process parameters give you a material out of that unit operation with specific attributes, the critical quality attributes.

The key feature here is that these unit operations are in a series, so the critical quality attributes coming out of an upstream unit operation are then the critical material attributes for the downstream unit operation. But overall, one of the most important features of QbD, one of its advantages, is that we’re moving from feedstock names to feedstock attributes. We all know that switchgrass and miscanthus, for example, are different from corn stover. The physicist Richard Feynman once said, “I can tell you the name of something, but I still haven’t told you anything about it.” So for the rest of this talk, we’ll be talking about those attributes, as opposed to feedstock names.

So that was kind of a theoretical perspective. I’d like to pick a specific example many of you might be familiar with—low-temperature conversion, where biomass is deconstructed to make a cellulosic sugar stream, which is then built back up to make a fuel or product. That’s what’s shown in this diagram. It’s actually my background, so that’s kind of why I’m using it. Let’s look at the fermentation unit operation in the little red box here. It has critical material attributes, critical process parameters, and critical quality attributes. The CMAs, if you will, for the fermentation step might include the monomeric sugar content and the presence and concentration of byproducts that were generated during the chemical deconstruction step, called pretreatment.

It would also include the inorganics that may have been introduced to the system, either intrinsic to the biomass or from the harvest-storage-transport process. That fermentation is going to be operated at a specific temperature, with a specific feeding strategy, with a specific reactor design, and using a fixed media composition. Those would be the critical process parameters. Out of that come the key product parameters, right? Titer, rate, yield. Those are the key things coming off of fermentation, along with maybe the presence of residual substrates, the sugars that didn’t get converted, or byproducts, something that the microbial conversion process made that we didn’t want to make. That would be a more specific example of how quality by design is used in our biomass value chain.
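To make that chaining concrete, here is a minimal Python sketch of the QbD bookkeeping described above. It is an illustration of the idea, not FCIC code, and every attribute name and number in it is invented for the example.

```python
# Minimal QbD bookkeeping sketch: a unit operation maps critical material
# attributes (CMAs) plus critical process parameters (CPPs) to critical
# quality attributes (CQAs); upstream CQAs become downstream CMAs.
from dataclasses import dataclass
from typing import Callable, Dict

Attributes = Dict[str, float]

@dataclass
class UnitOperation:
    name: str
    cpps: Attributes                                        # how it is operated
    model: Callable[[Attributes, Attributes], Attributes]   # (CMAs, CPPs) -> CQAs

    def run(self, cmas: Attributes) -> Attributes:
        return self.model(cmas, self.cpps)

def fermentation_model(cmas: Attributes, cpps: Attributes) -> Attributes:
    # Toy relationship: yield drops with inhibitor concentration;
    # titer scales with the available monomeric sugar.
    yield_frac = max(0.0, 0.9 - 0.05 * cmas["inhibitors_g_per_L"])
    sugar = cmas["monomeric_sugar_g_per_L"]
    return {
        "titer_g_per_L": sugar * yield_frac,
        "residual_sugar_g_per_L": sugar * (1.0 - yield_frac),
    }

fermentation = UnitOperation(
    name="fermentation",
    cpps={"temperature_C": 32.0},  # CPPs would also include feeding strategy, media, ...
    model=fermentation_model,
)

# The CQAs of an upstream pretreatment step serve as the CMAs here.
pretreatment_cqas = {"monomeric_sugar_g_per_L": 80.0, "inhibitors_g_per_L": 2.0}
print(fermentation.run(pretreatment_cqas))
```

The point of the structure is the chaining: change an upstream process parameter and the effect propagates downstream as altered material attributes.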

And I pointed this out not only because it’s my background; it’s also some of the research that we’ll be talking about just a little bit later in the presentation. So enough about the background of the FCIC. How are we organized? We are eight high-performing tasks working across the value chain: feedstocks, preprocessing, and conversion. We start with the feedstock variability task, which is developing tools to quantify and understand the sources of variability in those biomass resources—essentially what Mother Nature gives us, but also what gets brought in during harvest and storage. Farther down the value chain we have the preprocessing task, which is developing tools to take those heterogeneous biomass resources and turn them into more homogeneous feedstocks for the conversion tasks.

We have two conversion pathways: a high-temperature conversion pathway, looking at pine and residues going into catalytic fast pyrolysis and hydrotreating; and a low-temperature conversion pathway, looking at corn stover conversion to a number of products, starting with deacetylation and mechanical refining, moving through enzymatic hydrolysis, and then microbial upgrading of the cellulosic sugar stream to three different products, plus upgrading of the lignin-rich stream from deacetylation to other products. We have a materials handling task, which is developing tools, mostly modeling tools, to solve the issues associated with continuous, steady, trouble-free feed into the reactors; and a materials of construction task, which looks specifically at the issues associated with wear in the unit operations that make up a typical biomass conversion process.

The two enabling tasks: we have a scientific data management task to ensure the data being generated by these other research tasks are curated, stored, and made available to external stakeholders, and then a crosscutting analysis, or TEA/LCA, task. It is a hallmark of BETO-funded projects that techno-economic and life cycle analyses are an intrinsic part of the project, not only to assign a value to the research that’s being generated, but also to work with the researchers to identify the key points where more research will add more value. I’ll point out that the leadership of these tasks is spread across the national laboratories, and all the tasks have membership from multiple labs, so it truly is an inter-lab and interdisciplinary research approach.

So that’s my introduction. Let’s spend some time on the task highlights. As I mentioned, the goal of the FCIC is to develop first principles, knowledge, and tools to solve the issues that are associated with variability in feedstocks and in processes. So I’ll be presenting these results in terms of knowledge and tools. Given the time we have today, I can’t cover everything that we’ve been working on, so I would apologize in advance to my FCIC colleagues who are joining us today. If I don’t cover the specific work that you’ve been working on, I apologize, but I had to kind of limit all the topics I was talking to, just to fit this in. Also to the folks who are listening, a lot of the information that I’ll be providing in the next 25 minutes or so has a small little link at the bottom of the slide to the publication that has a lot more details.

It’s a bit of an eye test on these slides, but don’t worry about it. Don’t try to write it down; just take a note of the page number of the presentation. All of the publications that we have done, they are on our website, and at the end of the presentation we’ll have a link to the website, and you’ll be able to get to the publications there. So with that, let’s jump into the technical content of the presentation.

So again, we’ll start with the upstream, the top end of the value chain, with the feedstock variability task. As I mentioned earlier, its objective is to identify and quantify the initial distributions of feedstock critical material attributes—essentially the intrinsic variability that Mother Nature gives us, also including harvest and storage.

Their approach is to cast a wide net of characterization tools to go beyond the traditional paradigm of moisture, ash, particle size, and composition. That’s the feedstock variability task—I’ll point out it actually has researchers from six different labs working on it. Let’s start with a literature review article that they published earlier this year in a special issue of ACS Sustainable Chemistry & Engineering. This was an issue highlighting feedstock variability, and researchers from the FCIC have a number of papers in it. As I said earlier, variability has typically been looked at in terms of ash, moisture, particle size, and composition. But biomass, we all know, is more complex than that. We also need to start looking at the physical and mechanical properties that give rise to the handling issues. So this work surveyed advanced characterization methods that could help improve our knowledge of the chemical, physical, and mechanical properties. It’s an excellent resource, or starting point, if you will, for stakeholders working on this topic.

So then let’s move into the characterization with some laboratory work. So in this work, researchers from the feedstock variability task used an instrument called an analytical pyrolysis multidimensional GC to identify and quantify the breakdown products that are produced during the degradation of biomass—corn stover bales. We all recognize the difference between a pristine bale, a mildly degraded bale, and a severely degraded bale—not only the dry matter loss, but the difference in the fact that the bale handles differently. This work was the first step toward quantifying the organic products that get produced during that degradation step. The work showed that this biological heating, which is the source of the degradation, disrupts the cell wall structure more so for the hemicellulose than the cellulose chain. But these researchers were able to go from a qualitative assessment of degradation to a quantitative assessment of, “These are the specific compounds that get produced, and these are the concentrations of these compounds.” Again, this is a laboratory-based system, but it is a way to rapidly assess and starting to quantify the impact of degradation on stored biomass.

So that was the organic fraction, but we all know the inorganic fraction is also important. This work was among the first studies to look at inorganic species and how they change in biomass as a function of storage. The work involved mapping where these inorganic species are in the tissues. Collectively, these data suggest that the biological heating process translocated some of the elements, specifically silica, from one tissue type to a second tissue type. As this work goes forward, it could have significant impacts on how the biomass gets handled. If degradation is moving some inorganic species to outer layers of the biomass, that could have implications for wear in the biomass handling hardware throughout the process.

Both of those last two slides were really interesting, but they were laboratory-based. What is also needed are rapid, low-cost ways to characterize bales, if you will, in the field. I think this is a critical enabling technology for the team. In this work, researchers from Task 2 used a very low-cost RGB sensor, essentially a cellphone camera, and correlated the images to the glucan and silica content. They were able to use these data to measure the extent of degradation. This is very interesting work, and it could be used by farmers for in-field assessment and for online process control throughout harvest, storage, and preprocessing. This is just the beginning of this work, but as we move forward, we think this could be a really useful tool for industry.
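As a flavor of how image-based assessment like this can work, here is a minimal sketch that fits a regression from simple color features to glucan content. The features, data, and model below are all stand-in assumptions for illustration; the actual task’s correlation methods may differ.

```python
# Sketch: correlate simple RGB color features with a measured property
# (here, a synthetic "glucan" value). Synthetic data throughout.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def color_features(image: np.ndarray) -> np.ndarray:
    """Per-channel mean and standard deviation -> 6 features."""
    return np.concatenate([image.mean(axis=(0, 1)), image.std(axis=(0, 1))])

# Stand-in data: 200 synthetic 64x64 RGB "bale photos" whose color balance
# loosely tracks a synthetic glucan value, mimicking degradation browning.
glucan = rng.uniform(25, 40, size=200)              # wt%, synthetic
images = rng.uniform(0, 1, size=(200, 64, 64, 3))
images[..., 2] *= glucan[:, None, None] / 40.0      # tie one channel to glucan

X = np.stack([color_features(im) for im in images])
X_tr, X_te, y_tr, y_te = train_test_split(X, glucan, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print(f"R^2 on held-out images: {model.score(X_te, y_te):.2f}")
```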

OK, after that task we move into materials handling. The objective of this task is to look at the flow and transport of biomass. This was a particularly problematic issue in the workshop data we talked about earlier: there were lots of process upsets because biomass didn’t handle, didn’t transport, and didn’t store the way traditional agricultural commodities do. So the materials handling task is taking a combined experimental and computational approach, at multiple scales, to solve these issues. I’m going to take a little aside right here, because we’ll be talking a lot about computational work, modeling work, in multiple tasks within the FCIC.

And I want to point out that this task and all the other ones have a validation activity, not only to parameterize the data going into the models, but to validate each model and determine its range of applicability. Any of these computational models that lack validation may give a national laboratory researcher like me a great publication, but it limits the applicability of that tool for our stakeholders. So at the risk of sounding a little pedantic, I want to assure everybody that all of the modeling work the FCIC is doing is backed up by and tightly integrated with experimental work. With that, we’ll move into the first piece of work.

This work also started with a couple of journal articles that laid the groundwork, or put a stake in the ground, on what the state of the art is in this modeling area. Now, there are two different approaches to modeling biomass: discrete element modeling, DEM, and the computational fluid dynamics, CFD, or continuum mechanics approach. DEM is computationally intensive (each particle is described independently), whereas CFD treats the biomass as a continuum. That’s less computationally intensive, but it does depend on the constitutive equation you choose to describe the behavior of that continuum. These two papers also serve as a resource, or starting point, not only for the FCIC researchers, but for other folks working in this area. Let’s talk first about the CFD, or continuum mechanics, side. In this work, the researchers on the materials handling task looked at four different continuum particle flow models, four constitutive models, if you will, put them into a CFD model, and then validated them against some experimental work.

Again, the CFD models are easier to solve than the DEM models, but they do require a good constitutive model, and that’s what this work helped provide. This work will be offered for public use: we do have an open-source process, and between this spring and next summer, toward the end of the fiscal year, we’ll be open-sourcing the code we’re talking about today, not only in this task but in some other tasks as well. DEM, again, is more computationally intensive, but one thing the national laboratories have is a lot of high-performance computing, HPC, capability. So in this work, the researchers used X-ray computed tomography, XCT, to develop realistic particle shapes, not just cylinders, and used these in what is called a coarse-grained DEM model: computationally intensive, but not terribly so.
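For readers unfamiliar with DEM, here is a toy sketch of the core idea: every particle is tracked individually, with pairwise contact forces, which is why the method is expensive. Real coarse-grained DEM with XCT-derived particle shapes is far more involved; every parameter below is an illustrative placeholder.

```python
# Toy 2-D DEM: spherical particles, linear spring-dashpot contacts, gravity,
# and a floor. Illustrates why cost scales with the number of particle pairs.
import numpy as np

n, dt, steps = 50, 1e-4, 500
k, c = 1e4, 0.5            # contact stiffness (N/m) and damping (N*s/m)
radius, mass = 0.01, 1e-3  # m, kg
g = np.array([0.0, -9.81])

rng = np.random.default_rng(1)
pos = rng.uniform(0.05, 0.95, size=(n, 2))
vel = np.zeros((n, 2))

for _ in range(steps):
    force = np.tile(mass * g, (n, 1))
    for i in range(n):                 # O(n^2) pairwise contact check
        for j in range(i + 1, n):
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d)
            overlap = 2 * radius - dist
            if overlap > 0 and dist > 0:
                normal = d / dist
                rel_vn = np.dot(vel[j] - vel[i], normal)
                fn = (k * overlap - c * rel_vn) * normal
                force[i] -= fn         # equal and opposite contact forces
                force[j] += fn
    vel += force / mass * dt           # explicit time integration
    pos += vel * dt
    below = pos[:, 1] < radius         # floor at y=0, inelastic bounce
    pos[below, 1] = radius
    vel[below, 1] *= -0.3

print("max particle height after 0.05 s:", round(pos[:, 1].max(), 3))
```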

This work gives us, if you will, a virtual laboratory to understand and model biomass particle processes and what those mechanics look like. And again, this work will be open-sourced as well, as soon as the publication comes out. At the risk of being a little pedantic, this task has a multiscale validation and characterization team, so we’re doing real-world experiments, both in the laboratory and in the pilot plant, to both parameterize and validate all of the modeling work. For example, can we predict arching behavior in one of the computational models, and does that agree with the experimental data we can generate in the pilot plant? So, fine, let’s move on to preprocessing.

Now, preprocessing covers all of the unit operations between the storage step and conversion, whether at high temperature or low temperature. The preprocessing task within the FCIC is focused on milling, the breaking down of the particles, and fractionation, the separation of a biomass stream into different constituents. Because both of these processes are reasonably far upstream, they impact all of the conversion work, and that’s why it’s important that we focus on this issue as well. Let’s talk a little bit about deconstruction theory and practice. The researchers on the preprocessing team developed DEM models to predict the fracture of real biomass particles—in this case, pine particles. They take lab-scale data on stress/strain relationships and include those in the DEM model so that we can predict the breakdown of individual pine particles.

This DEM model will get integrated into a larger model of the comminution equipment itself, of the knife mill, to give us a way to predict, for example, the particle size distribution coming out of a knife mill based on the mechanical and physical properties of the biomass going into that mill—essentially a virtual laboratory for the mechanics of milling, which is really cool work. Now, the previous work used laboratory-scale data to get those stress/strain relationships, those mechanical and physical characteristics of biomass. The modeling work I’m going to talk about here complements that earlier work by trying to determine where those emergent mechanical properties come from. The toughness or the hardness or the stress/strain relationship of a biomass particle—you don’t think about that in terms of the hardness of a lignin molecule. So what this work is trying to do is predict the emergent mechanical properties from the microstructure and the composition: what molecular-level properties govern those emergent properties. Again, this is computationally intensive and uses a lot of HPC. But over the year or so that this work has been going on, these biopolymer components have been assembled into larger and larger structures. This year, in 2021, these computational structures will be large enough that we can validate them directly with nano-indentation experiments. And what we hope to do is tie this work into the work on the previous slide so that we can have a first-principles-based predictive tool for biomass properties.
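One common way a mill model like this can express its output is a population-balance breakage matrix that maps an inlet particle size distribution to an outlet one. The matrix values in this minimal sketch are invented placeholders; in the work described above they would come from the DEM fracture model.

```python
# Breakage-matrix view of milling: product PSD = B @ feed PSD.
import numpy as np

sizes_mm = [8.0, 4.0, 2.0, 1.0]            # size classes, coarse -> fine

# B[i][j] = mass fraction of class-j feed reporting to class i after one
# pass. Columns sum to 1; lower-triangular because particles only shrink.
B = np.array([
    [0.30, 0.00, 0.00, 0.00],
    [0.40, 0.45, 0.00, 0.00],
    [0.20, 0.35, 0.60, 0.00],
    [0.10, 0.20, 0.40, 1.00],
])

feed = np.array([0.5, 0.3, 0.15, 0.05])    # inlet PSD (mass fractions)
product = B @ feed
for s, p in zip(sizes_mm, product):
    print(f"{s:>4} mm class: {p:.3f}")
```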

I want to talk a little bit about air fractionation. This promises to be a really low-cost way to separate biomass streams based on physical characteristics. These differences show up in preprocessing, but as you’ll see later in the talk, they also show up in conversion. Now, the process development work for this area is being done at Idaho National Lab under separate funding. Where the FCIC fits in is that we’re characterizing the properties of these different fractions, specifically focusing on how those differences impact downstream conversion. We’ll talk more about that a little later.

OK, one last thing I want to touch on in the preprocessing task: real-time feedstock image analysis. This is a really important tool for seeing what’s going on in real time. Researchers on this task took a very low-cost camera and collected several thousand images of biomass as it moved across a weigh belt feeder during a corn stover pretreatment run in the pilot plant at NREL. They took those images and correlated them with process data. We have a very well-instrumented pilot plant at NREL, and they correlated those images with data coming from torque sensors. When a motor that’s moving the belt, or moving the plug screw feeder, or moving the biomass through the reactors shows an increase in torque, it means the biomass is harder to process.

And using some pretty interesting neural network tools, they built a classification tool to classify the biomass images as normal or likely to cause problems. Those anomalous images could then be fed right into a control strategy to warn the operators, and the equipment, that biomass with undesirable properties is coming down the pike.
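Here is a minimal sketch of that classification idea, using a generic classifier on synthetic features in place of the neural-network tools and pilot-plant images described above; the torque threshold and feature definitions are invented for illustration.

```python
# Sketch: classify feedstock "images" (here, synthetic feature vectors) as
# normal vs. likely to cause a torque excursion downstream.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Stand-in per-image features (e.g., texture/color summaries). Label 1 means
# the subsequent feeder torque exceeded an alarm threshold.
X = rng.normal(size=(1000, 16))
torque = 50 + 8 * X[:, 0] - 5 * X[:, 3] + rng.normal(scale=4, size=1000)
y = (torque > 60).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")

# In a control strategy, clf.predict_proba on live images could raise an
# operator warning before problem material reaches the plug screw feeder.
```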

OK, fine. Now let’s move to the high-temperature conversion task. Again, we’re looking at the catalytic fast pyrolysis of pine, followed by hydrotreating. There’s, again, five different labs working on this task. Each brings their own unique expertise, from leadership in the computation area with Oak Ridge and NETL, to the preprocessing folks at Idaho, to the conversion folks at NREL and ANL, and the characterization folks across those laboratories.

We talked a little bit about modeling, and modeling in the high-temperature conversion area operates at different scales. There’s a molecular scale that focuses on the intrinsic kinetics. There’s a particle scale that focuses on the heat- and mass-transfer issues associated with particles. And then there’s a reactor scale that takes those particles and includes them in a larger model. Now, because of the way this works, the computational models at the molecular scale or the particle scale can’t be integrated directly into that reactor-scale modeling. So the researchers developed what is called a reduced-order particle-scale model that could then be taken into the reactor-scale modeling. It’s still faithful, if you will, to the lower-scale modeling work, but it can now be used in these larger, reactor-scale models. A toy sketch of the reduced-order idea follows.
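The sketch below runs a deliberately simplistic “detailed” particle model over a grid of sizes and temperatures, then fits a cheap surrogate that a reactor-scale code could evaluate per particle. The kinetics and heat-up lag are invented assumptions for illustration, not the validated models discussed here.

```python
# Reduced-order-model (ROM) sketch: sample an expensive model, fit a cheap one.
import numpy as np

def detailed_conversion_time(d_mm: float, T_K: float) -> float:
    """Toy 'detailed' model: first-order devolatilization with a heat-up
    lag that grows with particle size; integrated numerically."""
    k = 1e5 * np.exp(-8000.0 / T_K)   # 1/s, toy Arrhenius constants
    lag = 0.05 * d_mm ** 2            # s, toy conduction lag
    t, dt, x = 0.0, 1e-3, 0.0
    while x < 0.99:
        if t > lag:
            x += k * (1 - x) * dt
        t += dt
    return t

# Sample the detailed model over a (size, temperature) grid.
ds = np.linspace(0.5, 3.0, 6)
Ts = np.linspace(750, 900, 6)
times = np.array([[detailed_conversion_time(d, T) for d in ds] for T in Ts])
D, TT = np.meshgrid(ds, Ts)

# Fit a quadratic surrogate in (d, T) by least squares.
A = np.column_stack([np.ones(D.size), D.ravel(), TT.ravel(),
                     D.ravel() ** 2, D.ravel() * TT.ravel()])
coef, *_ = np.linalg.lstsq(A, times.ravel(), rcond=None)

def rom(d_mm, T_K):  # cheap call, suitable inside a reactor-scale CFD loop
    return coef @ [1.0, d_mm, T_K, d_mm ** 2, d_mm * T_K]

print(f"ROM estimate at d=1.5 mm, T=800 K: {rom(1.5, 800):.2f} s")
```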

The actual reduced-order model was implemented and validated in the open-source CFD suite MFiX, which is available from our colleagues at the National Energy Technology Laboratory, NETL, and this module is freely available in the open-source MFiX suite. So let’s talk a little bit about the reactor scale. Now, pyrolysis of biomass is done in a fluidized bed reactor, an FBR, and sand is often used in that FBR as a heat-exchange medium. But as you can imagine, biomass and sand have dramatically different thermal and mechanical properties, right? They’re not the same thing. This contributes directly to instabilities in those reactors, which can be catastrophic to the operation of the process. If my fluidized bed reactor loses its fluidization and crashes, it takes the entire process down until we can refluidize that bed.

So this work used some advanced image analysis to capture the physical and mechanical behavior of the sand and biomass phases. The folks then took a hybrid drag model, which captures the key mixing effects, to create a better model that can be used to make better predictions and to understand and prevent these instabilities. Finally, let’s talk a little bit about the chemical kinetics. Now, as many of you will probably be aware, pyrolysis is oftentimes treated as a black box. There’s no detailed kinetic model because the biomass is so heterogeneous, right? We’ve got cellulose, we’ve got hemicellulose, we’ve got lignin, we’ve got inorganic species.

So researchers implemented a validated, detailed pyrolysis model from the literature, put it into the modeling framework we just talked about on the last couple of slides, and then were able to use that reactor-scale model to do a sensitivity analysis to see what’s important and what’s not. They showed that the yield of bio-oil is very highly correlated to the oxygen-rich lignin, but they also identified where in that kinetic model the largest uncertainties are and which of those parameters are very influential in the overall predictions, and therefore need more resources focused on them. A lot of this work will be validated in the coming months with laboratory-scale pyrolysis experiments.
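To illustrate what a kinetic sensitivity analysis looks like, here is a minimal sketch on a lumped competing-pathways pyrolysis model (biomass converting to gas, tar, and char in parallel). The rate constants are toy values, not the literature kinetics used in this work.

```python
# One-at-a-time sensitivity of predicted bio-oil (tar) yield to rate constants.
def tar_yield(k_gas, k_tar, k_char, t_end=2.0, dt=1e-3):
    """Integrate parallel first-order pathways: biomass -> gas/tar/char."""
    b, gas, tar, char = 1.0, 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        db = (k_gas + k_tar + k_char) * b * dt
        gas += k_gas * b * dt
        tar += k_tar * b * dt
        char += k_char * b * dt
        b -= db
    return tar

base = {"k_gas": 0.8, "k_tar": 2.5, "k_char": 0.5}  # 1/s, illustrative only
y0 = tar_yield(**base)

# Perturb each constant by +10% and report the normalized sensitivity.
for name in base:
    hi = dict(base, **{name: base[name] * 1.1})
    sens = (tar_yield(**hi) - y0) / (0.1 * y0)
    print(f"{name}: {sens:+.2f}")
```

Ranking parameters this way shows where tighter kinetic data would most improve the reactor-scale predictions.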

On to the low-temperature conversion task—again, this is the one we talked a little bit about earlier. Like the high-temperature conversion task, it’s focused on predicting the performance of the conversion process based on the properties, the attributes, of the biomass entering the process. Unlike the high-temperature pathway, the low-temperature researchers are using a machine-learning, or artificial intelligence, approach to do that modeling work. This one is also a little bit different in that we are doing deacetylation and mechanical refining, followed by enzymatic hydrolysis, to make a cellulosic sugar stream that is upgraded, and then the lignin-rich stream coming off the deacetylation is also upgraded.

So what I’ll do now is talk a little bit about the experimental data, and then the modeling framework that it’s going to be tossed into. Because the work to date used a standardized set of conditions for this pretreatment step, the deconstruction step, DMREH, not only was it used to produce these streams for biological upgrading, it also served as a screening tool to look at the impact of variability on the deconstruction step. So the researchers showed that different anatomical fractions of corn stover—the cobs versus the husks and leaves versus the stalks—were deconstructed differently, and saw statistically significant differences in both the lignin and the cellulosic sugar yields. I think this argues in support of a differential pretreatment.

Recall I mentioned a few minutes ago the fractionation work that’s going on in preprocessing. Finally, let’s talk about the core experimental work of the low-temperature conversion task. After we get the cellulosic sugar stream and the lignin-rich stream, they are microbially upgraded. We’re looking at three different pathways for the cellulosic sugar upgrading, and then a fourth pathway for the lignin stream upgrading. This work has shown, again, statistically significant changes in biocatalytic productivity—the rate, the titer, the yield—as a function not only of the anatomical fraction differences we just talked about, but also of whole corn stover samples that had different levels of inorganic material and different levels of moisture during harvest, storage, and preprocessing.

The purpose of this work is not only to identify those differences, but also to feed the data into the modeling portion of the work. So let’s talk a little bit about that. The modeling portion of this task is using artificial intelligence to predict the performance of these microbial conversion steps based on the attributes of the biomass coming into the process. They’re taking metabolic models that are available in the literature, embedding them in the AI framework, and populating it both with literature data and with all of the experimental data I just talked about on the last slide. Again, the overall goal: predict the performance of new organisms on new feedstocks.
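As a sketch of that attribute-to-performance mapping (without the metabolic-model embedding, which this toy example does not attempt), here is a regression trained to predict fermentation titer from invented feedstock attributes.

```python
# Sketch: learn titer as a function of feedstock attributes. Synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 300
attrs = np.column_stack([
    rng.uniform(55, 95, n),   # monomeric sugar, g/L (synthetic)
    rng.uniform(0, 6, n),     # pretreatment byproducts, g/L (synthetic)
    rng.uniform(2, 12, n),    # total inorganics, wt% (synthetic)
])
# Synthetic "true" response: titer rises with sugar, falls with inhibitors
# and inorganics, plus measurement noise.
titer = 0.45 * attrs[:, 0] - 2.0 * attrs[:, 1] - 0.5 * attrs[:, 2] \
        + rng.normal(scale=1.5, size=n)

model = GradientBoostingRegressor(random_state=0)
scores = cross_val_score(model, attrs, titer, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```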

Now, because AI, as many of you know, is a data-hungry approach, in the coming months the low-temperature conversion task is focused on generating a substantial amount of laboratory data to feed this model, and some publications on that will be coming out. Finally, the last experimental task is materials of construction. I think this is a great example of cross-laboratory collaboration: we’ve got subject-matter experts at Oak Ridge and Argonne joined with biomass experts at Idaho and NREL. This work is focused mostly on understanding the fundamental causes of wear in processing equipment, and then on looking at mitigation steps. So let’s start with some experimental work.

In this work, the researchers on this task showed dramatic differences in wear rates among different anatomical fractions of hardwoods and of pine, and they demonstrated that the wear is largely caused by the presence of extrinsic inorganics, the inorganics that get brought into the biomass stream during harvest, transport, and storage, which were much more abrasive. And because the analytical method used here preserves the actual composition of those species, we were able to identify which specific species are causing most of the wear. The standard ash measurement, which can be very useful for many purposes, is a combustion method that turns all the inorganics into their oxides; this technique was able to keep the composition of those species intact.

So that work, and some other work that I don’t have time to talk about, showed the impact of those inorganic materials and which materials might be more useful. But until you have a modeling framework to hang the laboratory work on, the utility of those data is limited, as good as they are. So in this work, researchers from this task took a well-understood, well-accepted erosive wear model and were able to predict the data coming from an accelerated wear-testing apparatus, capturing the influence of the material of construction, the biomass-borne inorganic species, and the geometry of how the wear occurs. This is also the subject of an upcoming manuscript.
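As an illustration of the kind of model involved, here is a minimal sketch in the spirit of classical Finnie-type erosive wear: erosion scales with impact velocity squared times a piecewise function of impact angle. The constant K is a placeholder, not a fitted FCIC value, and real fits add material- and particle-specific terms.

```python
# Finnie-type erosive wear sketch for a ductile target surface.
import numpy as np

def erosion_rate(v_mps: float, angle_deg: float, K: float = 1e-7) -> float:
    """Mass loss per unit mass of impacting abrasive particles (kg/kg)."""
    a = np.radians(angle_deg)
    if a <= np.arctan(1 / 3):          # shallow impact: cutting regime
        f = np.sin(2 * a) - 3 * np.sin(a) ** 2
    else:                              # steeper impact: falls toward zero
        f = np.cos(a) ** 2 / 3
    # Note: the classic Finnie form predicts zero wear at normal incidence,
    # a known limitation for brittle targets.
    return K * v_mps ** 2 * max(f, 0.0)

for angle in (15, 30, 60, 90):
    print(f"{angle:>3} deg impact: {erosion_rate(25.0, angle):.2e} kg/kg")
```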

But again, by adding the modeling component to the experimental component, we build a much more powerful tool for industry. Finally, the crosscutting analysis task. As I mentioned, TEA/LCA are hallmarks of BETO-funded projects: to valorize the work that’s being done, to identify future research objectives, and to help identify and rank the problems that each of the experimental tasks is working on. The main outputs are case studies; I’ll talk about two of them in just a second. I’d also like to point out that the modeling frameworks being used in this task are leveraged from frameworks developed by BETO-funded research over the last decade or more, so we’re taking advantage of that knowledge and applying it directly to the impacts of feedstock variability.

This is a modeling study on the cost of bio-oil, measured as the minimum fuel selling price, MFSP, as a function of a number of material attributes going into the pyrolysis process. Again, this is a modeling activity, but they looked at the influence of mineral matter composition, moisture content, particle size distribution, and extractives content, as well as the reactor temperature, and fed all of that into a larger Aspen model to predict the performance. The net-net of this is that if we do nothing to control the natural variability of inorganic materials seen in biomass, we can see up to a 20% range in MFSP. This work will be validated with more experimental work in early 2021.
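To show the shape of such a sensitivity study, here is a minimal sketch: a one-at-a-time sweep of feedstock attributes through a stand-in cost function. The linear MFSP surrogate and every coefficient below are invented for illustration; the actual study ran full Aspen-based process models.

```python
# One-at-a-time MFSP sensitivity sweep over feedstock attributes.
base = {"ash_wt_pct": 5.0, "moisture_wt_pct": 20.0, "extractives_wt_pct": 4.0}
lo_hi = {"ash_wt_pct": (2.0, 12.0),
         "moisture_wt_pct": (10.0, 30.0),
         "extractives_wt_pct": (2.0, 8.0)}

def mfsp(attrs):
    """Toy linear MFSP surrogate in $/gge; coefficients are placeholders."""
    return (3.00 + 0.08 * attrs["ash_wt_pct"]
                 + 0.02 * attrs["moisture_wt_pct"]
                 + 0.03 * attrs["extractives_wt_pct"])

# Swing each attribute across its range while holding the others at baseline.
for name, (lo, hi) in lo_hi.items():
    span = mfsp({**base, name: hi}) - mfsp({**base, name: lo})
    print(f"{name:>20}: MFSP swing = ${span:.2f}/gge")
```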

I’ll also point out that, of all the material attributes the modeling work looked at, the inorganic species were by far the largest contributor to this variability. So that was the TEA. Let’s talk a little bit about the renewability impacts; this is work coming out of our colleagues at Argonne. They used dynamic life cycle analysis to look at the impact of particle size on renewability metrics. As many of you are probably aware, a smaller particle size going into pyrolysis will give us an improvement in fuel yield. That’s the red line on this plot: as particles get smaller, toward the left of the plot, the yield goes up. Unfortunately, so does the energy required to do that size reduction, that comminution. So at smaller particle sizes, we get higher overall yields.

We also get higher electricity use. The purpose of this work wasn’t to pick an optimum, but to demonstrate the conflicts that have to be balanced between the renewability aspects and the techno-economic aspects. Finally, we have a data integration and quality by design task; Task 4 is building database tools to integrate all of the attribute work we’ve talked about from across the FCIC. We are using a LabKey environment, an open-source data hub; think of it as a database-enabled website. We are providing this to our internal FCIC researchers but also to external stakeholders, and we’ll be rolling out the external portal later this fiscal year, in early 2021. Finally, many of you might be familiar with the Bioenergy Feedstock Library at Idaho National Lab. I’ve used it to pull data out of.

I should confess I’ve also put data into it on some of my other research projects. We’re not trying, in the FCIC, to recreate anything that’s already been done, so this work aims to increase the interoperability between our LabKey environment and the BFL, leveraging that work rather than replacing it. OK, that’s all the technical content. In the few minutes we have left, I want to talk about our industry advisory board and a couple of other things, so I’ll go quickly. Industry advisory board: I won’t try to read all these names; I’m assuming everybody here can do that themselves. I will point out that our industry advisory board is small but mighty. It is incredibly diverse, both in terms of background—we have academic folks; we have industrial advisors—and in terms of which area of the value chain they have expertise in.

We have experts in feedstocks, in preprocessing, and in conversion. We have experts who work in herbaceous materials and woody materials, and we have experts in the high-temperature and the low-temperature conversion pathways. So it is small but very diverse, and they have been incredibly effective. They’ve been weighing in on potential case studies that the Task 8 folks will be working on in the upcoming year, and they’re also engaged in specific task work in their respective areas of expertise. So I want to give a shout-out to our industry advisory board: we really do appreciate the time you take to help us out. We did do a directed funding opportunity a couple of years ago. I’m not going to cover that at all, but I wanted to point out that we are working directly with industry on a number of projects—everything from feeding issues on small-scale gasification systems to online acoustic moisture measurement for real-time control. This work is really interesting, and we can provide some more information. As you know, we’re coming up on peer review in the spring, and there’ll be more information on these projects then.

Finally, I did mention the publications. We are publishing a lot of the details of our work in multiple venues; we’ve got over 29 publications from fiscal year 2020, and they’re available on our website. I think exactly one of them has my name on it, so I was able to do a little bit of technical work this year. And finally, I’ll just close with some contact information. Again, my name is Ed Wolfrum. I serve as the principal investigator, and here’s my contact information.

My colleague Amy Sluiter serves as the FCIC project manager, and here’s her contact information. And finally, I want to open the floor to questions from the chat, but I also want to point out that this picture shows a number of the researchers who are working on the FCIC, again, all across the value chain. Before anybody asks: yes, this picture was taken before our current COVID crisis. I do want to shout out and acknowledge those folks. I also want to acknowledge our BETO stakeholders, Bo Hoffman, Mark Ellis, and Liz Moore. So we’ll take questions. If we run out of time, there are three ways to get more information besides the questions in the chat. We’ll be having a year-in-review report that’s going to drop onto our website shortly after the holidays.

It’s in the publications approval process now, and that’ll give a little bit more detail on all of the work I talked about. We do have our publications on our website; if you want a lot more detail, you can look there. Or you can drop us an email, and we can connect you directly with the researchers doing the work you’re most interested in. That may be the fastest and most effective way to answer your questions. So I will be looking in the chat box. And we will also be putting up a little poll of our own, and we would really appreciate answers to those structured questions, which I’m seeing. That’ll be open for the next half hour, so if you have more time, you can go in there. But we’d love to get some input there, and if you have any other input, that would be great.

And I’m going to get through these questions. Again, my colleague Amy Sluiter and I’m looking at these. One specific question: “The research looks great,” thank you very much, “but it does feel that if a lot of the commercial value will end up in the pockets of large projects as opposed to small, medium, rural locations won’t be able to afford the CapEx of complicated processing.” I think that’s certainly a valid concern. Again, I would say that by using a first-principles approach to understand the real influences on the issues that have been seen, we hope that might indirectly help out processors of all sizes. But I agree that the CapEx of complicated processes do favor the larger-scale folks. But again, our role I think is to get the scientific underpinnings of the problems. But it is a good thought, and we do need to keep that in mind.

“What are the remaining gaps for FCIC and others that should be addressed to make the high-temperature and low-temperature biomass conversion processes commercially feasible?” We are looking across the value chain, and we think, again, that where the FCIC is adding value is in the identification of the key attributes, so that we can understand how biomass handles, for example; how it’s converted; what changes occur while it’s stored. I can’t think of a specific gap. We all understand the economic challenges that have been caused by going on a decade of relatively low energy costs. But I would point out that going to an attribute-based approach for looking at feedstocks and processes will enable us, and our stakeholders, to focus on, for example, cost-advantaged feedstocks that might be very useful.

So we can take the learnings we’ve presented for these two biomass materials and apply them directly to feedstocks that may have a better cost advantage. Next question: “Can we use system output product efficiencies and properties as a feedback parameter to detect, diagnose, and restore one or more unit operations?” We are looking at feedback loops, but I apologize; I don’t completely understand that question, and we’re running out of time, so maybe that’s something we could take offline. “How do you interface with the potential synbio and biomass processes? Who leads this, and where is it covered?” I think one of the things that has worked out really, really well with the FCIC, and I can’t take credit for it, is that everybody’s talking to everybody else.

We specifically have a lot of interaction between the research on the Task 7 low-temperature conversion pathway and the BETO-funded Agile BioFoundry, ABF. I want to say almost everybody on the FCIC low-temperature conversion task, led by Dr. Phil Lively at Argonne, is also involved in the Agile BioFoundry. “Have there been any in-depth investigations into coproducts generated specifically for carbon capture by the FCIC?” No, we haven’t done any carbon capture work. That is an area of research that BETO is working on, but it’s not within the FCIC portfolio. And I apologize; we’re almost out of time here.

“Is there within FCIC the topic of preprocessing and technology development? The first thing that comes to mind might be torrefaction.” Excellent question. The FCIC has been focused more on understanding the impacts of variability on processes that are already being developed or researched by core BETO-funded projects. So no, we’re not doing process development so much as we are understanding the first-principles causes of the variability that impacts those process developments. “Looking at the cellulosic plants that have been less than successful, can you identify a single point of failure common to all of these?” I think the short answer is no. The longer answer is that I’m probably not in the best position to answer that question, but we can certainly follow up and connect you with folks who would have specific opinions on that.

And with two minutes to go: “Are there opportunities for collaborations and to develop proposals?” Right now this is a national laboratory project. We do have some external collaborators, but we don’t currently have plans for any proposal opportunities coming out in the short- to mid-term time frame. Amy will collect the rest of the questions, and we’ll get back to everybody. So thank you very much. I can give everyone about 45 seconds of their morning back. But again, thanks to our BETO stakeholders, and a huge shout-out to the researchers who have done all this work. And thank you all for taking time out of your day to join us. One last thing: please fill out the poll. If we’ve said anything here that’s interesting, please fill it out.

We’ll take all the feedback you have, positive, negative, middle of the road. Thank you very much, and everybody stay safe and have a great day.

[End of Audio]