Laboratory Testing & General Mineral Processing Engineering


Modeling and Simulation of Mineral Processing (28 replies)

(unknown)
8 years ago

"The journey is more important than the destination." Trying to build a model of a process or product forces you to ask questions about that process or product that you perhaps wouldn't have asked otherwise. Even if your model, once in place, isn't very helpful because of uncertainties etc., you might have learned a thing or two about the process or product you are trying to model. Measurement is, of course, part of this journey. Today I work with engineering simulation not related to mining/minerals, but this rule applies just the same.

(unknown)
8 years ago

This is the first time I am learning about the simulation of processes, especially the concentrator. I feel that models can open clear doors to problem solving, making it easier to troubleshoot a problem in a process plant. I am eager and looking forward to learning more about process models.

(unknown)
8 years ago

If the modeling / simulation OR measurements don't lead to some change - in our plant or knowledge - they accomplish little to nothing!

Some useful applications (personal experiences):

1. Improved monitoring and stabilisation of grinding and flotation circuit performance at various operations

2. Improved understanding from modeling leading to dramatically different standard operating conditions and improved recovery.

Jean Rasczak
8 years ago

From my experience, proper measurement, analyzed by qualified individuals, leads to useful models. Models not built on real-world measurement often lead to costly mistakes. So I would reverse the question to read, "What have measurements and models ever done for us?" and reply, at least in my case as an energy efficiency consultant with a mineral processing background, that they have allowed me to deliver tens of millions of dollars of cost reduction to my employers and now clients. But beware of models not rooted in real-world experience.

(unknown)
8 years ago

Models are not perfectly real, but they lead to measurable optimization.

We have found modeling to be very useful in early assessments when hunting for optimization. All of our models have been developed in house, and they are adjusted frequently as data is collected from actual results. Mostly they are, for us, a starting point to determine a direction of adjustment.

Our modeling is only for media size selection to meet mill discharge particle size targets. These models have a proven record of great accuracy for this application. I accept that our modeling is far less complex than flotation modeling, as the initial data we require are always hard numbers.

(unknown)
8 years ago

What have the Romans ever done for us?

•The aqueduct

•Sanitation

•The roads

•Irrigation

•Medicine

•Education

•Wine

•Public safety

•Public Health

Yes... all right, fair enough...

https://www.youtube.com/watch?v=ExWfh6sGyso

(unknown)
8 years ago

This provides an opportunity for people to discuss how mathematical models are used in Minerals Engineering. Perhaps the question was too broad.

Anyway, I was pondering the question and considered what the first applied mathematical model was. The earliest I could find was from about 600 BC, with Pythagoras' theory of music. I would suspect that some of the Babylonians may well have used mathematical models for irrigation.

However, I think the deeper issue, and perhaps what is being alluded to, is that in our industry there may be many practitioners who do not see that mathematical models are of value.

I had an alternative discussion in Mineral Processing Innovation and asked the simple question 'How many mineral processing engineers can write a computer program?' Naturally the responses varied, but many claimed that basic software skills served no purpose in their jobs.

As software is the means by which mathematical models are applied, I think it is a fair inference that those same practitioners did not see the value in implementing mathematical models (other than those already available to them through existing software or consultants).

One of my pet hates is where a position is advertised as 'mathematical modeller' and the requirements do not include much maths. That is, the 'mathematical modeller' is someone who implements an already existing mathematical algorithm (or set of algorithms) using a packaged software system.

But at least these people know that what they are running is based on maths.

The main problem I see (and please disagree) is that many people can run complicated models but have little understanding, nor are they required to have understanding, of the basis of the models. Thus we have 'ignorant experts'.

My favourite quote in a course I ran on simulation was a participant who asked halfway through the course "by the end of the course will I be an expert?"

This is something I see all too often. People do one week courses and then claim to be experts.

And job advertisements reinforce this concept. They don't ask 'do you understand the subject area?' but 'are you skilled in running the following software?'

(unknown)
8 years ago

We've all seen the uncertainties in process models, but we need to remember that models are very good for the structural aspects inherent in minerals processing. When I was at Northparkes we had a rill tower structure fail due to wear; the consultant mechanical engineers were able to predict the thickness of the remaining steel to within half a millimetre (it had worn to 4 mm from the 12 mm originally installed thickness). After the repairs, those parts could be tested annually and are still being used safely in production. Without models, a knee-jerk response might have been to remove the structures and use a different feeding arrangement. Can anybody imagine building a large mill shell without detailed FEA? To answer the OP: we've got lighter and bigger steel structures thanks to models.

Zander Barcalow
8 years ago

Good, robust, reproducible measurement is the basis of all scientific endeavours.

With good measurement we can establish a base line of performance, and then measure any change or improvement against this baseline.

Good, robust, reproducible measurement is essential for the Scientific Method.

And there should be more of it!

Measure, Measure, Measure and this will show the natural variation in your process.

Then you will know what you are up against when trying to quantify what is a significant change in performance.
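
That idea can be sketched in a few lines of Python. The daily recovery figures and the two-sigma rule below are invented for illustration, not taken from any plant: establish a baseline, quantify its natural variation, and only call a change significant if it falls outside that band.

```python
import statistics

# Hypothetical daily recovery (%) under normal operation -- the baseline.
baseline = [85.2, 84.7, 86.1, 85.5, 84.9, 85.8, 85.0, 85.6, 84.4, 85.9]
mu = statistics.mean(baseline)
sigma = statistics.stdev(baseline)

def looks_significant(new_value, n_sigma=2.0):
    """Crude screen: is a new result outside the natural variation band?"""
    return abs(new_value - mu) > n_sigma * sigma

# A 0.3% 'improvement' sits well inside natural variation...
small_shift = looks_significant(mu + 0.3)
# ...while a 2% shift falls outside it.
large_shift = looks_significant(mu + 2.0)
```

With this baseline the 0.3% shift is reported as not significant and the 2% shift as significant, which is exactly the point: without measuring the natural variation first, both would look like "improvements".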

I probably have a different view about modelling to most people, in that modelling is very good at framing the question you are trying to ask, and at making you collect the relevant, good, robust, reproducible measurements needed to do the modelling.

The measurement data are more important than the modelling.

And you need lots of measurements to understand the distribution of the data.

Often what is easy to measure is what gets measured, but these may not be the variables that are important for controlling the process.

And what is measured is normally what is controlled, so you may be pulling the wrong levers and controlling the wrong variables when trying to improve the process.

E.g.: I have seen level control on a pump box tuned very tightly to hold the level very accurately, because the instrument tech saw this as the objective, so every variation in flow was immediately passed downstream. The benefit of having a pump box is to use it as a surge tank and absorb the variation in flow, so there is a steady feed to the downstream process; it should therefore be tuned to absorb the variation and improve the overall process. The wrong variable being controlled well.

(unknown)
8 years ago

Understanding the accuracy of the sampling and measurement methods, where the errors may occur, and the magnitude of those errors is a very important factor in checking the inputs to any model or investigation. You also need to be able to check that the measurements look correct before they are used.

I often see people trying to balance a circuit where the feed assay is not between the concentrate and tailings assays, and then wanting to use a mass balancing program to fix this, when they need to understand where the error may have occurred and then go and re-measure it correctly.
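
The consistency check described above can be sketched with the standard two-product formula. The assay values here are invented for illustration:

```python
def two_product_split(feed, conc, tail):
    """Fraction of feed mass reporting to concentrate, from assays alone.

    Classic two-product formula: C/F = (f - t) / (c - t).
    Returns None when the feed assay does not lie between the
    concentrate and tailings assays -- a sign of sampling or assay
    error that should be re-measured, not 'balanced away'.
    """
    lo, hi = sorted((conc, tail))
    if not (lo <= feed <= hi):
        return None  # inconsistent assays: go back and re-measure
    return (feed - tail) / (conc - tail)

# Consistent survey: 2.0% feed, 25% concentrate, 0.15% tailings.
split = two_product_split(2.0, 25.0, 0.15)
# Inconsistent survey: feed assay higher than both products.
bad = two_product_split(30.0, 25.0, 0.15)
```

The second call returning `None` is the situation described above: no mass balancing program can fix assays that are physically inconsistent.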

People are often unable to review data for correctness and errors, due to poor understanding of the process and techniques being used. I think this skill is required before you start modelling.

Otherwise as most people say about models, rubbish in produces rubbish out.

Modelling is good at making you understand the process being investigated, and at determining whether you are asking the right questions and collecting the relevant measurements.

The methods of building models have given us a better understanding of how our equipment and processes work and what are the important variables that need to be measured and controlled.

This has allowed us to have great improvements in plant control and optimisation.

But I’ve seen plants where the only reason they have a full plant survey of the grinding circuit or the flotation circuit is because someone had wanted to use JKSIMMET or JKSIMFLOAT and got consultants to conduct surveys to feed the data into the models. But the measurement data tells you a lot more about the plant operations than the modelling work that was being conducted.

At least this achieved one good survey to quantify plant performance, but it did not tell you anything about the variation in performance with time, so it was assumed that there is no variation - an untested assumption.

Empirical models are sometimes good in the region where the baseline has been measured, but as you move away from that baseline the errors can become quite large.

Are you using interpolation between two known, measured conditions/points, or are you using extrapolation past the known data set? The first is more likely to be correct than the second.
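
A small Python sketch of that point, using a made-up curved "true" response: a straight-line empirical model fitted through two baseline points stays close between them, but drifts badly far outside them.

```python
# Hypothetical (unknown to the modeller) process response: mildly curved.
def true_response(x):
    return 10.0 + 2.0 * x + 0.05 * x * x

# Empirical model: a straight line through two measured baseline points.
x1, x2 = 10.0, 20.0
y1, y2 = true_response(x1), true_response(x2)
slope = (y2 - y1) / (x2 - x1)

def linear_model(x):
    return y1 + slope * (x - x1)

interp_err = abs(linear_model(15.0) - true_response(15.0))  # inside [10, 20]
extrap_err = abs(linear_model(50.0) - true_response(50.0))  # far outside
```

For this example the interpolation error is 1.25 while the extrapolation error is 60: the same model, but the answer is only trustworthy near where it was measured.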

Marshal Meru
8 years ago

Understanding how the different models work and what they are based on is very important for knowing whether you are modelling reality or just producing a non-feasible, imaginary data set.

Understanding the boundary conditions and limitations of a model is essential for knowing whether you are extrapolating past the point of validity.

Some models will tend to overestimate or underestimate the magnitude of a change, depending on the type of model and the math used in it. Once again: understand the limitations of the model. If you think there are no limitations, then you do not understand the model.

Anyone can run a model – but how do you validate it for correctness?

You need to understand in advance what the model should be doing, based on its math and structure, and also what happens in the actual process, to know whether the model's answers are correct or just an artefact of the math.

So if you do not know what the answer should be approximately before you start, how will you know you have a valid answer?

In my view, models normally need to be validated by more robust testwork and by many measurements of predicted versus actual performance; i.e., do more testwork and take more measurements.

Modelling can be a tool to help you frame your question and collect the correct measurements and data, but it also needs work on validating the answer. Modelling can help you to define what your hypothesis is.

Scientific Method:

Ask a question, review the data (and the literature), make a hypothesis, design an experiment to test the hypothesis, analyse the results, test the hypothesis against the results, repeat.

And so in conclusion:

“Apart from better sanitation and medicine and education and irrigation and public health and roads and a freshwater system and baths and public order... what have the Romans done for us?”

(unknown)
8 years ago

There have been some excellent comments already on this topic. I particularly like the argument that a few people have made that building a model encourages, and often requires, building a better understanding of the process. Before one can quantify different unit operations, he/she must have a firm understanding of the process on more than just a qualitative level.

As far as what models have done for us, I'd like to mention some ways the Automation Solutions division of ANDRITZ is helping clients with its IDEAS Simulation Software. (Our software customers are also doing the same work using IDEAS.) To be fair, there are other software packages doing most, if not all, of the same or similar work, so this can be considered a list of what modeling and simulation are doing for the mineral industry. I will list these activities as they occur during a project lifecycle, be it a completely new facility or a new process area at an existing plant.

1. Design Engineering with simulation. Quality models offer process designers the opportunity to evaluate alternative flowsheets at a steady-state level. As design continues, such models may be extended to consider dynamic behavior and used to evaluate the facility's reaction to upsets. The modeller/process engineer/control engineer can also design and compare the behavior of different control strategies.

2. Design Validation. Once the design has been selected, and engineering completed, models may be used as virtual, full-scale pilot plants to test the behavior of the plant. Where are the bottlenecks? Are all the process design criteria being met? Do our slurry pipe velocities fall within acceptable ranges under all circumstances, including over the course of the mine plan? Have our combined safety factors led us to overdesigning parts of the plant? Etc.

3. Control System Verification. After testing and acceptance of the control system, process simulators may be linked to control system emulators to create a combined Operator Training Simulator (OTS) station. Such systems are typically first used to test the performance of the control system during start-up, shutdown, ore changes, upset conditions, etc. This testing reveals errors and desired changes to the control system, the HMI (operator interface), and to the simulation itself. Some of these errors may be simple tag text errors, while others might cause operational calamities.

4. Operator Training. The OTS system is then used to train operators, much like the airline industry uses flight simulators. Building a knowledgeable operator pool provides more productive, and safer, plant operations. Operators can perform start-ups, shutdowns, ore feed rate and grade changes, handling of equipment failures, etc. Such OTS systems build awareness and understanding of the process and the control system - beyond simple cause and effect, to understanding why, and how quickly, change occurs.

5. Operational Improvement. Much as in the initial design phases, a model may help operations consider and evaluate possible changes to the process, and/or process controls.

So, the above are many ways models do benefit the mining industry.

In keeping with the other motif of this discussion, let me close with one of my favorite Monty Python quotes: "You can't expect to wield supreme power just because some watery tart threw a sword at you!"

(unknown)
8 years ago

When discussing models we need to differentiate the following:

1. Equations

2. Mathematical Models

3. Simulation

The difference between an equation and a model is often blurred, but in general one would not and should not call an equation a model.

A model is generally a system of equations. If someone has data and fits a curve through it, to me it is quite a stretch to call this a model.

If one has a number of 'models' for units and they are linked together we now have a simulation system.
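
That hierarchy can be illustrated with a toy sketch: each unit "model" is a small system of equations wrapped as a function, and linking the units gives a (very small) simulation. All numbers and model forms here are invented for illustration, not real unit models:

```python
# Toy size-reduction model: more specific energy gives a finer product.
def grinding_mill(feed_tph, p80_in_um, energy_kwh_t=8.0):
    p80_out = p80_in_um / (1.0 + 0.1 * energy_kwh_t)
    return {"tph": feed_tph, "p80_um": p80_out}

# Toy two-product split into concentrate and tailings streams.
def flotation_cell(stream, recovery=0.85, upgrade=10.0):
    conc_tph = stream["tph"] * recovery / upgrade
    tail_tph = stream["tph"] - conc_tph
    return {"tph": conc_tph}, {"tph": tail_tph}

# Linking the unit models together is what makes it a simulation.
mill_product = grinding_mill(100.0, 150.0)
conc, tail = flotation_cell(mill_product)
```

Note that mass is conserved across the linked units by construction; in a real simulation system that conservation, and the validity of each unit model, is exactly what has to be checked.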

At each level of analysis (equation, model, simulation) the basis for the method, the purpose of the method - inclusive of its strengths and weaknesses needs to be fully understood.

This relates to context. A method may very well be useful for one context but terrible for another.

If one does an honest review of 'simulation' (in the context of mineral processing) then there are many gaping holes.

I think the worst common mistake (amongst many) is that a model developed for designing a plant is also used for operating the plant.

I noticed this lack of consistency in simulation models many years ago (from day one in mineral processing), and it was one of the reasons that led me to develop independent simulation software.

(unknown)
8 years ago

Out of curiosity, in what way(s) are you (or others) using a model to operate a plant?

(unknown)
8 years ago

The scientific method which is at the heart of engineering is about building, evaluating, upgrading, and using models to represent systems (or parts of systems) to achieve practical aims. An issue lies in what we define as models, development of models, and use of these models. Often people are solely thinking of the maths component. For example, our understanding of the selective flotation of sulfides is based on an electrochemical model for some aspects and a physicochemical hydrodynamic model for other aspects.

A practical reality is that we often see engineering and operating practice deviating widely from the scientific method, including lack of awareness of existing valid models and their effective use.

(unknown)
8 years ago

I have developed my own independent software. Over the last five years I have oscillated between employment and working independently. In the last round of independent work I originally just developed a mass balance system with the intent of linking it to LIMN.

In the end this was discontinued, and I decided to use Visio as the flowsheet system, with the mass balance interface via Excel. Through a project with Rio Tinto I was able to get funding to extend this to what I call a 3D system: size, particle class (or density class), and elements within each particle class.

I then reassessed market interest in the mass balance system; the general response was that it needed to be connected to a simulator. At the time I could not establish an appropriate strategic relationship with any of the simulation groups, so I developed one myself. However, all I wanted to do was focus on a holistic simulator, rather than create a detailed engineering simulator.

A holistic simulator provides general guidance on how to improve plant performance. Initially I was just going to introduce conventional models although based on a multi-mineral particle based system (what I call the 3D system).

There is one major problem with this approach: to get the data, it would seem that a costly sampling campaign would be required.

However, I have previously worked on this problem at the JKMRC; the sub-problem is how to estimate detailed data about a plant from data at a lower level of depth.

Relevant to this discussion, the simulation approach I have developed treats measurements as a manifestation of the real structure. So I use measurements to identify the detailed ore characteristics. This method extends what I did at the JKMRC by using ore variability to advantage. I have patented this approach.

I presented this work at Las Vegas; and I can send those interested the link. I mentioned this method in Mineral Process Innovation and for a while I was overwhelmed with requests for the link.

So back to the problem at hand. I use mass balancing to infer information about ore at the appropriate level of depth. Furthermore the simulator has been developed so that the interface (as represented by the operating parameters) is customised for each client.

To give you an idea of how long it took me to develop the simulator: I think it took about one month to develop the simulator itself, but about three months to create the extensible interface. It has taken another three months to develop the capacity to integrate ore variability. (The mass balance system could now be argued to be a 4D system, but at this level I call it MMPlantMonitor.)

Now, when a user uses the customised simulator (and mass balance system), information about ore variability is recorded in a database; this database therefore includes unit performance and operating conditions. So, if there is a history of operational changes, it is straightforward logistic regression to model changes in unit performance against operating conditions. In the absence of history, simple models are used to start with.

The main deviation between my approach and convention is that conventional approaches are model-driven. That is, great effort is spent constructing a model, and then the model is at best tweaked to satisfy measurements. In the last year I have been horrified to discover how many plants rely totally on these models and do no data collection whatsoever.

My approach is more data-driven. Hence I say my approach extends mass balancing to machine learning.

Needless to say, a balancing act is required between model-driven and data-driven approaches. However, any serious 'modeller' should be aware of the strengths and weaknesses of the various approaches; and by 'modeller' I mean someone who wants to understand what they are doing, not some computer junkie.

(unknown)
8 years ago

Mainly about the simulation course: last year I was invited to give a course on simulation in South Africa. This course has now been separated into a Fundamentals Course and an Advanced Course. In the Fundamentals Course I stick largely to conventional approaches, to at least introduce the elementary concepts of simulation. In the Advanced Course I introduce concepts that are not otherwise taught to mineral processors, through either graduate programs or publications; about ten such concepts would be covered.

For you specifically, the course is very much workshop style. Your attendance would be an asset particularly to get a different perspective. I would certainly hope others from commercial simulation backgrounds will also attend.

(unknown)
8 years ago

It is good you are keeping this issue on the burner. As an old timer, let me say this straight: to have a model for a particular flowsheet, we must first identify the critical parameters to be measured; their effects on the flow rates, grades, and recoveries have to be known. The instruments available to measure the parameters, and their costs, have to be noted. Having done this, we must know the design and operating variables one can manipulate in an operating plant. With knowledge of all these, one must develop a model that includes all the parameters mentioned above. Such models are badly needed to be of relevance. Yes, if one can develop a model with minimal knowledge of the above, but with theory and the help of mathematics, it would be a great contribution.

I want to add that we need a data bank of existing models, where they have been applied, and the data from such use. Such an effort would be a benchmark for future work on modelling.

I would be happy to get comments on my perception.

(unknown)
8 years ago

The idea of a database is a good one. The trouble is that, in general, the mining industry is not very collegial.

Just a note: not only is the software I developed extensible (meaning users can add their own models), I also created a Model Registration system whereby independent model developers can add their own models and then either on-sell them to users or make them available for free. I am sure that with the inevitable migration to the Cloud there will be much greater opportunity for major advances in sharing.

Zander Barcalow
8 years ago

I think models and measurements are very useful. I developed a model to predict the air slide temperature (the air slide carried dried concentrate) at a concentrator. This model allowed higher throughput in the drier.

Marshal Meru
8 years ago

You described at length the creation of your software, but I can only infer the answer to my earlier question: "Out of curiosity, in what way(s) are you (or others) using a model to operate a plant?"

Based on the fact that you are creating steady-state mass balance models, I would infer that you are attempting to predict the results from making process changes. True?

Jean Rasczak
8 years ago

I used Minitab software. In it I used the DOE function (historical DOE) with historical data (data spanning a few years). I came up with a function that predicted the air slide temperature to within ±5 °C of the actual temperature (a sensor/temperature probe was located in the air slide). This permitted the operator to increase the temperature in the dryer (a higher temperature in the dryer translates to a higher temperature in the air slide), which means higher throughput in the dryer (higher throughput of concentrate being dried).

If the temperature in the air slide gets too high (above about 40 °C), the whole drying circuit stops. Thus, production stops.

If historical data are analysed in Minitab ('historical DOE' is the term in Six Sigma), a fairly good transfer function can be obtained.

(unknown)
8 years ago

Yes, that is right: a simulator is an emulation of a plant. So if you want to predict the results of making a real operational change, you simulate it by making the same change in the simulator.

(unknown)
8 years ago

Fifteen years ago, at my previous company, most of my work was of that type, using steady-state mass balances. We also used steady-state energy balances, which are critical in that industry and most others.

Within Andritz, most of my work is with dynamic simulation. We almost always perform simulation with both mass and energy calculations, and we also do a large amount of steady-state simulation. Most of our software users primarily perform steady-state simulation with our IDEAS software, including feasibility studies, initial design, and what-if scenarios.

(unknown)
8 years ago

The answers or solutions obtained from models have to be validated in the plant. Based on the outcome of the model, the engineer has to determine whether it is feasible. These solutions give us an idea of how to solve the problem in the plant. It is then up to the engineer to make the changes necessary (within reason) in the plant to implement the improvements.

(unknown)
8 years ago

Some answers to the historical questions, gentlemen, and another side of the technology as more food for thought. I'm not judging, just offering a very interesting opposing view on the future:

http://www.abc.net.au/news/2014-11-06/kohler-algorithms-are-taking-our-jobs/5870662

(unknown)
8 years ago

I read the article, and then thought: "I bet it was written by an Australian". And indeed it was.

Many Australian workers are quick to see a new technology and then immediately feel threatened by it. I never noticed this phenomenon until I was about 30, when I joined the mining industry.

I understand the word for people who are scared of technology is 'technophobic'. Hence the title of the article: 'algorithms are taking our jobs', not 'algorithms are helping us do our jobs more efficiently'.

So here is the issue: go to a mining company and say, 'we can use smart algorithms to improve plant performance'. The reaction: 'they are trying to sack us'.

This is why some Service Companies introducing new technologies may employ psychologists to assist in marketing.

(unknown)
8 years ago

The article was sent to me from home as an example for another discussion. The historical comment was of interest, but the usual rebuttal applies: technology creates as many work positions as it takes away, and usually at a higher skill level. However, the article was more supportive of algorithms than against the development.

(unknown)
8 years ago

Yes, no issue with the fact that technology is changing.

I am in other forums where 'Big Data' methods are discussed. Often discussions are quite heated.

Society is reaching a point, as predicted by Isaac Asimov in the 1960s/70s (and I deeply regret not remembering the exact title), where the professional becomes an 'expert in running software' and cannot understand it from first principles.

For example, I was talking to my eldest son who is doing Biomedical Engineering and he mentioned he just did an exam for Process Control.

So I asked if it was all about PID controllers, which he confirmed. Then I asked whether he knew the difference between P, I, and D. He responded that this was not covered in the course.

Similarly my second son (Mechanical Engineering) used Simulink to simulate a braking system for a car.

I said that sounded very useful, but perhaps quite hard for first year. It turned out that he was given the code and just had to make sure it ran, then vary some parameters to assess cause and effect.

The mining industry is already going down the path of 'Big Data' analysis. In my view, uncritical acceptance of new methods is just as detrimental as immediate rejection.

What this means in practice is that managing innovation in mining companies is going to be that much harder. The question is whether mining companies can evolve their management structures to deal with the new innovation challenges.
