Optimization of Overall Processes in the Chemical Industry

When planning chemical plants, a large number of specifications, settings and objectives have to be taken into account. For example, operating and investment costs should be kept as low as possible, while at the same time producing products of the highest possible quality. In addition, there are environmental and safety requirements that must be met.

To meet these requirements, the process engineer must compare not only the various settings of a single plant, but also different plant designs that produce the same end products from the raw materials via different processes. This is a difficult task, especially for complex plants with a large number of apparatuses.

This planning is carried out with computer-aided simulations, in which the best possible settings and plant designs are sought using the engineer's experience and empirical search. However, given the complexity of the problem, the best settings for the desired objectives will not be found without a transparent optimization strategy.

A New Approach for Planning Processes

In this project, a new approach to the design of chemical production plants is being developed. Only those solutions which constitute the best compromises between the different objectives, while respecting the constraints, are considered and analyzed further. This set of best solutions is generated automatically for the parameter range of interest and then presented graphically to the engineer. The engineer can thus make a rational decision based on knowledge of the complete range of best solutions.
In the examples analyzed so far, this procedure revealed parameter ranges which had not previously been considered in the empirical optimization. Solutions in these ranges are much better than those found empirically. In addition, the planning time has been reduced significantly.
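To illustrate the idea of a "set of best compromises", the following minimal Python sketch filters candidate designs down to the non-dominated (Pareto-optimal) ones. The two objectives, the Design class, and the field names are hypothetical stand-ins for whatever cost and quality measures a concrete plant model provides; they are not part of the project's actual software.

```python
from dataclasses import dataclass

@dataclass
class Design:
    name: str
    cost: float      # total annualized cost, to be minimized (hypothetical objective)
    impurity: float  # product impurity, to be minimized (hypothetical objective)

def is_dominated(d, others):
    """d is dominated if some other design is at least as good in every
    objective and strictly better in at least one."""
    return any(
        o.cost <= d.cost and o.impurity <= d.impurity
        and (o.cost < d.cost or o.impurity < d.impurity)
        for o in others if o is not d
    )

def pareto_front(designs):
    """Keep only the non-dominated designs, i.e. the best compromises."""
    return [d for d in designs if not is_dominated(d, designs)]
```

Presenting exactly this non-dominated set graphically is what allows the engineer to weigh, for example, lower cost against higher purity without first discarding any viable compromise.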

 

The Workflow Process Engineering (© AdobeStock / Fraunhofer ITWM)

In addition to rigorous flowsheet simulation, shortcut methods help to obtain a first, quick overview of the totality of solutions. Solutions obtained in this way perform significantly better than those found empirically, and the planning time is reduced considerably.
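As an illustration of what such a shortcut method can look like, the sketch below uses the classical Fenske equation to estimate the minimum number of theoretical stages of a binary distillation column at total reflux. This is a textbook shortcut chosen for illustration, not necessarily the one used in the project, and the numerical values are assumptions.

```python
import math

def fenske_min_stages(xD, xB, alpha):
    """Fenske shortcut: minimum number of theoretical stages of a binary
    distillation column at total reflux.

    xD    : light-key mole fraction in the distillate
    xB    : light-key mole fraction in the bottoms
    alpha : average relative volatility of the key components
    """
    return math.log((xD / (1 - xD)) * ((1 - xB) / xB)) / math.log(alpha)

# Quick screening of a separation before running the rigorous flowsheet model
print(fenske_min_stages(xD=0.95, xB=0.05, alpha=2.5))  # roughly 6.4 stages
```

Such estimates narrow down the interesting design region cheaply before the rigorous simulation is applied.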

Definition of a Data Interface: INES – Interface Between Experiments and Simulation

The rigorous models must be calibrated using process data to make reliable, realistic predictions. In the INES project, a user-friendly interface to historical process data was created for BASF's flowsheet simulator.

The reliability of the data is assessed here by three criteria:

  • No outliers
  • Stationary intervals
  • Fulfillment of mass balances

Outliers are removed interactively. This allows the user to perform outlier detection adapted to the context (measurement device, error sources). Stationary intervals can be obtained by means of a heuristic segmentation of the data series. In this approach, so-called breakpoints are set where the mean values of adjacent intervals differ most significantly in a statistical sense.
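A minimal sketch of such a heuristic segmentation is given below: it places a breakpoint where a two-sample test on the means of the adjacent segments is most significant and then recurses on both sides. The concrete test (Welch's t-test), the significance threshold, and the minimum segment length are assumptions for illustration, not the exact heuristic used in INES.

```python
import numpy as np
from scipy import stats

def find_breakpoint(series, min_len=20, alpha=1e-3):
    """Return the index where the means of the two adjacent segments differ
    most significantly (Welch's t-test), or None if no split is significant."""
    best_idx, best_p = None, alpha
    for i in range(min_len, len(series) - min_len):
        _, p = stats.ttest_ind(series[:i], series[i:], equal_var=False)
        if p < best_p:
            best_idx, best_p = i, p
    return best_idx

def segment(series, min_len=20, alpha=1e-3):
    """Recursively split a data series into approximately stationary intervals."""
    idx = find_breakpoint(series, min_len, alpha)
    if idx is None:
        return [series]
    return segment(series[:idx], min_len, alpha) + segment(series[idx:], min_len, alpha)
```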

The fulfillment of mass balances is handled in the context of an interactively configurable data reconciliation, which considers components and reactions in addition to the overall balances. This allows the rapid detection of systematic errors in the balances that are due to leakage.
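The following sketch shows the core of such a reconciliation in its simplest, linear form: the measured flows are adjusted as little as possible, weighted by their measurement uncertainty, so that the mass balances are satisfied exactly. The stream layout and numbers are hypothetical, and the real tool additionally handles component and reaction balances; large adjustments in the result point to systematic errors such as leakage.

```python
import numpy as np

def reconcile(y, sigma, A):
    """Weighted least-squares data reconciliation with linear balance constraints.

    y     : measured stream flows, shape (n,)
    sigma : measurement standard deviations, shape (n,)
    A     : balance matrix, shape (m, n); each row encodes one balance A @ x = 0

    Returns flows x that satisfy the balances exactly while staying as close
    as possible (in the weighted least-squares sense) to the measurements.
    """
    V = np.diag(sigma ** 2)                          # measurement covariance
    correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ y)
    return y - correction

# Hypothetical example: a splitter with feed = out1 + out2
y = np.array([100.0, 61.0, 42.0])                    # measured flows
sigma = np.array([1.0, 1.0, 1.0])                    # measurement uncertainties
A = np.array([[1.0, -1.0, -1.0]])                    # feed - out1 - out2 = 0
print(reconcile(y, sigma, A))                        # balanced flows
```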

Video (in German): Resource-Efficient Production in the Chemical Industry at Fraunhofer ITWM


Whether in agriculture, industry or private households, chemicals are needed everywhere. However, their production consumes an extremely large amount of energy. With a novel hybrid approach, energy savings in the double-digit percentage range are possible, depending on the plant and process. »For our analysis, we brought two things together: first, the laws of physics, which we represented in a model – in other words, expert knowledge of the thermodynamic and chemical processes. And second, the data that various sensors record during the process, for example on temperature and pressure. We use these where no physical data are available,« explains Karl-Heinz Küfer, head of the division »Optimization«.
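The hybrid idea described in the quote, a physics-based model corrected by a data-driven term fitted to sensor data, can be sketched as a simple grey-box model. The physics_model placeholder, the choice of ridge regression, and the input variables below are illustrative assumptions, not the ITWM implementation.

```python
import numpy as np
from sklearn.linear_model import Ridge

def physics_model(temperature, pressure):
    """Placeholder for the first-principles (thermodynamic/chemical) model."""
    return 0.8 * temperature - 0.05 * pressure

def fit_hybrid(T, p, measured):
    """Fit a data-driven correction to the residual that the physics model
    does not explain (grey-box / hybrid modelling)."""
    residual = measured - physics_model(T, p)
    X = np.column_stack([T, p])
    corrector = Ridge(alpha=1.0).fit(X, residual)
    return lambda T_new, p_new: (
        physics_model(T_new, p_new)
        + corrector.predict(np.column_stack([T_new, p_new]))
    )
```

The physics part supplies expert knowledge where sensor data is sparse, while the fitted correction captures plant-specific behavior the idealized model misses.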