Assaying, Microscopy, Mineralogy & XRF/XRD 2017-04-04T06:57:57+00:00

Automated Mineralogy Methods QEMScan (15 replies and 1 comment)

Helena Russell 2 years ago

How applicable are automated mineralogy techniques beyond mineralogy, and how much of the use of automated mineralogy data 'beyond mineralogy' has been to look at rock properties that can be derived from having quantified the minerals present and their texture? A case in point would be producing a 'fracturing index' for a rock mass, i.e. to predict how a rock will break. This can then be applied, for example, to:

Blast optimization – using the cuttings from drill holes to quantify the rock mass in 3D and produce a fracturing model for a bench, which can then be used to selectively load explosives to optimize the blast and fragmentation.

Fracking – again using the cuttings from drilling to help understand the structure and physical properties of a rock mass and to identify key areas for fracking (or, equally, to reject areas that are not appropriate for fracking).

Bob Mathias 2 years ago

Currently I am returning to the issue of texture classification via a mathematical textural descriptor. I worked with a group who use hyperspectral data (visible to infrared). The resolution is about 30 µm, which is certainly not as detailed as MLA/QEMSCAN, but it is practical enough that most drill cores can be analyzed in their entirety.

I am certainly interested in linking the data with MLA/QEMSCAN, and active plans are in play. The idea in this case is to use a mathematical descriptor of texture (the occurrence matrix, which is similar to mineral grain surface-area association). This is used to identify which parts of the drill core are 'similar', so that if tests are applied to one drill core sample the results can be inferred elsewhere.
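To make the 'occurrence matrix' idea concrete, here is a minimal sketch (in Python rather than Matlab) of a phase co-occurrence matrix computed on a tiny, made-up mineral-label map. The phase labels and map are purely illustrative, not real MLA/QEMSCAN output:

```python
import numpy as np

# Hypothetical 2D map of mineral labels (0 = quartz, 1 = pyrite, 2 = chalcopyrite).
# In practice this would come from a classified mineral map or hyperspectral image.
phase_map = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 2],
    [0, 1, 1, 2],
    [0, 0, 2, 2],
])

def cooccurrence(labels, n_phases, offset=(0, 1)):
    """Count pairs of phase labels separated by `offset` (here: horizontal neighbours)."""
    dy, dx = offset
    a = labels[: labels.shape[0] - dy, : labels.shape[1] - dx]
    b = labels[dy:, dx:]
    m = np.zeros((n_phases, n_phases), dtype=int)
    np.add.at(m, (a.ravel(), b.ravel()), 1)  # unbuffered accumulation of pair counts
    return m

m = cooccurrence(phase_map, 3)
print(m)            # off-diagonal counts reflect phase-boundary (association) abundance
print(m / m.sum())  # normalised: an empirical texture descriptor
```

Normalised matrices from different core intervals can then be compared (e.g. by a simple distance measure) to flag 'similar' intervals, which is the inference step described above.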

A more detailed mathematical approach was developed based on wavelets. Another approach I developed was a simplified cracking model at the JKMRC, which was an extension of my PhD. My PhD approach was the basis of JKTexture, which I understand is now being credited to others; any JKMRC/JKTech staff are welcome to discuss its current status. The idea was to develop a rules-based system for how ores break, intended to replace the finite-element approaches often being used, whose weaknesses could be the subject of a major discussion.

The third approach I developed was the probability model, which inherently used the manifested preferential breakage for a particular test case and was used to predict what would happen if the feed texture changed incrementally. Obviously the third approach was the most directly suited to inclusion in any simulation system.

From my viewpoint, my various approaches were being explored within the AMIRA Geomet. Project, but because there was no plan to integrate this research with a simulation model it was a touch pointless. Hence my point is that there are numerous ways to approach texture modelling for the purposes discussed.

Therefore it can only really move forward successfully if all the approaches are considered together. As always it is virtually impossible to do any serious texture modelling without easily-accessible data (which I mentioned in an earlier discussion in Stereology/Mineralogy).

One of the advantages of the data mentioned earlier is that it comes in a standard image format; for example, I can easily read gigabytes of data using standard high-level software such as Matlab.

Alan Carter 2 years ago

Years ago, my colleague and friend introduced me to X-ray mapping, and we used it to map the element distribution in experimental charges (small) and meteorites (parts of thin sections), and subsequently to identify minerals. I have also used this approach at the Brookhaven SXRF/SXRD beam-line to map trace and major element distribution in thin sections. It was petrography, but more fun and of course not automated. These days I am more interested in measuring grain size and shape. We have used Matlab and digital photography for 2D analysis and would like to expand that to 3D. At this stage the whole process is not automated, as we are still experimenting with it, but we would like to get there at some point.

Bob Mathias 2 years ago

By 3D I think you mean the stereological issue; if not, please clarify. Hence the Stereology/Mineralogy group is certainly a good forum for discussion. I will respond privately on this, although others might want a more general discussion here. The stereological issue is one of the key reasons why pixel data is required, rather than interpreted data.

Helena Russell 2 years ago

It's really good to 'listen' to you guys. I have worked with QEMSCAN for the last five years at Tata Steel, India; in fact I look after the lab. The automated mineralogy technique is simple (except for the stereology part) but extremely versatile. If you have a recognisable difference in average atomic number among phases (forget the EDS spectrum), it can practically map any phase: natural minerals or other phases present in slag, sinter, etc. But I feel less confident about the 3D stereology part because of the inflexibility in choosing a model while processing the data. I don't know whether I am right.
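As a toy illustration of mapping phases by average atomic number contrast, the sketch below classifies a synthetic BSE image by nearest phase mean grey level. The grey levels and noise model are assumptions for illustration; a real species identification protocol also uses EDS spectra precisely because BSE contrast alone overlaps for many phases:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic BSE image: three phases with distinct mean grey levels
# (grey level scales roughly with average atomic number), plus detector noise.
true = rng.integers(0, 3, size=(64, 64))
means = np.array([60.0, 120.0, 200.0])        # assumed phase grey levels
bse = means[true] + rng.normal(0.0, 10.0, size=true.shape)

# Classify each pixel to the nearest phase mean (simple BSE thresholding).
labelled = np.argmin(np.abs(bse[..., None] - means), axis=-1)

accuracy = np.mean(labelled == true)
print(f"pixel classification accuracy: {accuracy:.3f}")
```

With well-separated mean atomic numbers the thresholding works almost perfectly; narrow the gap between `means` values and misclassification rises, which is the 'overlap' problem discussed later in this thread.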

Bob Mathias 2 years ago

I don't necessarily agree that the mineralogy technique is simple. When I was at the JKMRC there were quite a large number of technical issues with the mineral identification algorithms, and I am unaware whether those issues were ever resolved. It is only when third-party QAQC has been performed on a mineralogical analysis that we can ever truly sign off on the adequacy of the methods. With reference to stereology, again we need some clarification. There are a variety of stereological approaches: Barbery & Leroux (with modifications by Leigh); King & Schneider (with a modification by Spencer); Miller and Lin; Hill, Jones and Horton; and myself (including Keith).

One of the main difficulties with 'stereology' in the realm of mineralogy is the general mathematical weakness of mineralogists. I do not know how to say this any other way, and I apologize in advance. However, in mineralogy there is the common use of jargon such as 'association', 'modal analysis' and 'CLY90' rather than conventional statistical language such as mean, variance and covariance. Because of this use of 'jargon' it is almost impossible for mineralogists to utilize well-established mathematical principles. (I am not against the use of 'jargon' per se, only when it means that conventional terms are not understood.)

It would take a detailed review to discuss the various stereological approaches, but suffice it to say there are strong similarities between the work of Barbery and his colleagues and that of my colleagues. We use the theoretical work of Davy (1984). Davy's work is very difficult to understand. I wrote a paper (1995, J. Microsc.) in which these equations are derived and explained so that they are more directly applicable to particle section data. The equations are used to give the variance and covariance of mineral composition in particles.

So the first issue is whether these 'geometric probability equations' are correct. They have repeatedly been validated both practically and theoretically, and I consider it a reasonable statement that they can be used with full confidence. In contrast, some of the other stereological methods do not use the geometric probability equations and may make rather specific assumptions, such as particles corresponding to capped spheres. Those authors can defend those approaches if they want.

The main difference between the Barbery approach and my approach is that Barbery assumed a 'texture model' - in his case a Boolean texture; whereas I used a non-parametric approach based on information theory (initially recommended by an excellent student - Jonathan Keith).

The method was extended to the full mineral distribution in particles - not just binaries. Now because of my direct involvement in the development of these algorithms I have strong confidence in them, but agree this would be an 'act of faith' by users. To dampen this 'act of faith' I suggested that the particle section to particle adjustment should be compared to the linear intercept to particle section adjustment - in which case it would certainly appear that the stereological adjustment is reasonable.  Such an approach was planned to be included on the MLA system - but it never eventuated. And this is partly the reason why I have gone independent.

In general I find that the various service groups are not interested in stereological adjustment. They might be interested in the intellectual engagement in discussing the approach but not the practical implementation.  The 'confidence' issue might be one reason - although I consider the lack of general mathematical understanding to also play a part.

However, an important point remains that sophisticated stereological adjustment requires the pixel data rather than the interpreted data. I consider this to be the main obstacle, and it has been since Barbery first raised the issue. Hence my position remains that if the pixel data were more easily accessible, the market would be able to develop the software necessary to fully utilise mineralogical data.

I do need to make the concluding comment that if any service group is interested, I remain as always keen to see the approach implemented. I am sure that once the benefits of high-level automated use of mineralogical data are evident, the various techniques will become much more commonplace.

Also if any group is interested in me providing a course on advanced mineralogy (including stereology, liberation modelling, texture modelling, plant audit analysis, database techniques, and other mathematical approaches) I am more than happy to discuss.

John Koenig 2 years ago

I was feeling the same regarding pixel data availability, accessibility to certain data, and the ability to program software to utilize the mineralogy data. I am talking about the QEMSCAN and iDiscover software I used. I see that you suggested using an Access database, but better than that, it would have been wonderful if the data were kept in a GIS-based database. That way we could have used extensive querying techniques to filter the data when needed. Given the limited queries in the software, I was very surprised by the inability to filter particles containing a certain element. Programming or automating the software for certain jobs is also not possible, or very limited.
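The kind of query being asked for here becomes trivial once the data sit in a general-purpose table rather than a closed package. A sketch with pandas, using an invented grain-level export (the column names, minerals and values are hypothetical, not iDiscover's schema):

```python
import pandas as pd

# Hypothetical flat export: one row per mineral grain, with its host particle id.
grains = pd.DataFrame({
    "particle_id": [1, 1, 2, 3, 3, 3, 4],
    "mineral":     ["quartz", "chalcopyrite", "quartz",
                    "pyrite", "chalcopyrite", "quartz", "pyrite"],
    "area_um2":    [120.0, 30.0, 200.0, 45.0, 10.0, 90.0, 60.0],
})

cu_bearing = {"chalcopyrite"}  # minerals carrying the element of interest (Cu)

# "Filter particles containing a certain element": keep every grain belonging
# to a particle that hosts at least one Cu-bearing mineral.
has_cu = grains.groupby("particle_id")["mineral"] \
               .transform(lambda s: s.isin(cu_bearing).any())
cu_particles = grains[has_cu]
print(sorted(cu_particles["particle_id"].unique().tolist()))  # → [1, 3]
```

The same `groupby`/`transform` pattern handles grade cut-offs, locking classes, and so on, which is the "extensive querying" a GIS or SQL back-end would give for free.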

Bob Mathias 2 years ago

Now you are indeed talking high-level, and this is exactly the discussion started in the Stereology/Mineralogy group on texture modelling. It is very difficult, say, to do a thorough mineralogical analysis on drill cores (because of cost). The most ambitious project I am aware of is the BHP Billiton Olympic Dam study. SGS (which both of us previously worked for) was one of the major service groups for this project.

That is why I see it would be useful if we could relate mineralogical analysis data to lower-resolution data such as Corescan via texture models (other suggestions welcome). And yes, all the data needs to be connected to some high-level system, including connection to a geostatistical model (although Aykut's suggestion of a GIS model is presumably similar). This approach was definitely the vision of Guillermo Turner-Saad, whom I worked for at SGS. Obviously this discussion could very well occur in the Geomet. group, but that group also appears to have temporarily died down. As always, I remain very interested and definitely want to assist whichever group wants to pursue these activities.

Victor Bergman 2 years ago

Fully automated digital image analysis of particle size and shape in 2D is now a mature technique offered by many companies. I have personally dedicated a lot of my research energy to this topic over the last few decades. Since 2006 I have also been developing 3D techniques for the International Fine Particle Research Institute.

I expect to publish a review in "Stereology and Image Analysis" this year. FYI, some practical data linked to the analysis of core chips were published at the last World Congress on Particle Technology in Nürnberg: Pirard et al., 2010, 3D and 2D particle image analysis of rock chips generated by core scratch tests.
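For readers new to the topic, basic 2D size-and-shape measurement of the kind described is only a few lines in any image-analysis environment. A sketch on a synthetic binary image (the shape descriptors here are deliberately crude; commercial packages offer far richer ones):

```python
import numpy as np
from scipy import ndimage

# Synthetic binary image of particles (1 = particle, 0 = background).
img = np.zeros((20, 20), dtype=int)
img[2:6, 2:6] = 1          # a 4x4 square particle
img[10:12, 4:14] = 1       # a 2x10 elongated particle

labels, n = ndimage.label(img)          # connected-component labelling
for i in range(1, n + 1):
    mask = labels == i
    area = int(mask.sum())
    ys, xs = np.nonzero(mask)
    # Equivalent circular diameter and a crude bounding-box aspect ratio.
    eq_diam = 2.0 * np.sqrt(area / np.pi)
    h, w = np.ptp(ys) + 1, np.ptp(xs) + 1
    aspect = max(h, w) / min(h, w)
    print(f"particle {i}: area={area} px, eq. diameter={eq_diam:.2f} px, aspect={aspect:.1f}")
```

The elongated particle comes out with an aspect ratio of 5, the square with 1, which is the sort of 2D descriptor that then needs stereological care before being read as a 3D shape.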

Tony Verdeschi 2 years ago

To add to this conversation, in support of the 'myths-and-misunderstandings' theme: one must remember that we are attempting to bridge mathematical and descriptive characterization, and even though earth scientists may, in general, be weaker at math than our engineering counterparts, one has to be careful about re-inventing the wheel, as it were. No amount of math is going to 'prove' a mathematical approach unless it is grounded in empirical and rational evidence, not to mention standard approaches. Furthermore, a proper and rational understanding of the separate and interdependent effects of sampling, stereology and accurate identification is the cornerstone of advancement.

Bad data cannot be stereologically corrected. Truth be told, THIS is why many people prefer to not 'correct' data, and anyone with a healthy dose of skepticism would be wise to follow the same adage.  Finally, at the 'business end', any modelling should, of course, be applied at the pixel level but, these days, with so many people with access to editing data at the pixel level, this can be a bit of a moving target. Standardization anyone?

 Also, to answer your original question, plenty of automated applications beyond 'mineralogy' are possible. I think the real answer is that the definition of 'mineralogy' is too restrictive.

Maya Rothman 2 years ago

Environmental geochemistry: there was a good paper by Duncan Pirrie in 2009 on using automated mineralogy to understand the mineralogy of mining-related waste from historic minelands and clean-up areas. XRF is used now, and reclamation decisions are often based on total metals content rather than actual bio-availability, which depends on toxic-metal exposure and mineralogy. There are also numerous automated particle analysis applications that fit under this category of automation: metal inclusions, wear particles, tribology, cleanliness, foreign particle detection, gunshot residue, and more. Aspex (http://www.fei.com/aspex-product-group/) has designed their equipment specifically to capitalize on these other applications.

Tony Verdeschi 2 years ago

The reference to 'simple', I think, stems from being able to devise and conduct a rapid test without the need for a detailed, overblown study. Contextually, simple does not necessarily mean superficial and/or invalid for reasons of statistical rigor. Unfortunately, simple is too often associated with simplified, the latter often leading to all sorts of error which, no matter how many tests are conducted to produce an apparently statistically valid data-set, can lead to misleading - or false-negative - results.

Al Cropp raises an important point: I believe the application to fragmentation is an area of rewarding potential; work at UBC in Vancouver sponsored by TeckCominco has been done, and other work in Nottingham, I believe, is in progress and/or planned. In the 1980s, the comminution group in Utah (King & others) started the ball rolling on this; continuation has been sluggish and needs a kick-start! They looked into the fundamentals of crack propagation and other material micro-behaviours, the idea being to develop better modifications to the homogeneous breakage liberation models than were touted and theorized over at the time.

This comment covers the other 'beyond mineralogy' theme: inclusion ratings in steel and other lengthy metallographic procedures could do with automation, and there is scope for some cross-disciplinary benefit there as well. After all, the numbers and sizes of inclusions per square mm in metals are a direct correlative of the problems of fine-grained Au and PGM in ores and products!

JohnnyD 2 years ago

Good discussion, but there is still a big challenge in how this information can be captured in comminution, as the aim of mineral liberation analysis is to enhance the comminution process. Gy and other researchers have played their great part, but I think there is still a lot to be done to settle the issue of mineral liberation, particularly in comminution.

Tony Verdeschi 2 years ago

I believe the place to start is to data-mine heaps of existing information (or make a thorough, systematic study), such as the many-fraction, size-by-size liberation data QEMSCAN is now capable of producing. It will take lots of measurements, but this data can be examined to properly assess selection and breakage functions and to put to rest the silly assumptions that have traditionally been made in liberation modelling. These include homogeneous breakage, conservation of interfacial area and all of the myths and misunderstandings that Gy addressed in his book. People have to pick up where these great researchers of the mid-to-late '80s left off, because technology has finally caught up to help. Even I have some ideas, so it cannot be difficult for some smart folk to take up this challenge.
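For context, the selection and breakage functions mentioned here sit inside the batch-grinding population balance dm_i/dt = -S_i m_i + Σ_{j<i} b_ij S_j m_j. A sketch with made-up S and b values (illustrative only, not fitted to any real liberation data set):

```python
import numpy as np

# Size classes are ordered coarsest-first.
n = 5
S = np.array([0.8, 0.6, 0.4, 0.2, 0.0])      # selection function, 1/min (illustrative)
# Breakage distribution b[i, j]: fraction of broken class-j mass reporting to class i.
b = np.zeros((n, n))
for j in range(n - 1):
    b[j + 1:, j] = 1.0 / (n - j - 1)         # uniform split to finer sizes (illustrative)

m = np.array([1.0, 0.0, 0.0, 0.0, 0.0])      # all feed mass in the top size class
dt, t_end = 0.01, 5.0
for _ in range(int(t_end / dt)):             # explicit Euler integration
    dm = -S * m + b @ (S * m)                # dm_i/dt = -S_i m_i + sum_j b_ij S_j m_j
    m = m + dt * dm

print(np.round(m, 4), "total:", round(float(m.sum()), 6))  # columns of b sum to 1, so mass is conserved
```

Fitting S and b from measured size-by-size (and, with liberation data, mineral-by-mineral) mass balances, rather than assuming forms like homogeneous breakage, is exactly the data-mining exercise proposed above.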

Helena Russell 2 years ago

I am happy that the discussion is still ON. My experience says that we can do a lot of work beyond actual mineralogy. The latest developments in pore calculation by QEMSCAN for well logs etc. can easily be used to calculate pore spaces in pellets, refractories, etc. But I agree that there is still a lot to be done on the development front (software etc.). Why are CSIRO/FEI not considering adding other parameter(s) to the BSE and EDS spectrum to eliminate the overlap situations that arise while developing a SIP? The more complex the phases present, the more frequently overlaps happen.

Tony Verdeschi 2 years ago

Everyone understands the issues attached to SIP development. FEI are already taking this up and will be developing ways of dealing with overlaps by making tools that are equally useful in SIP development and validation. This is not only a QEMSCAN issue; it also applies to making mixed spectra in MLA, where a created mixture 'steals' from a real mineral (or vice versa). The best ways of dealing with this are under active investigation, rest assured.

David 2 years ago

Automated Mineralogy...
