Transforming data into knowledge

19/07/2024 | 2 mins

By Doug MacLaurin

As artificial intelligence and machine learning tools become more ingrained in our everyday lives, the team at the UWA Data Institute is working to understand how the tools we increasingly rely upon to inform our decisions actually work.

Researchers are investigating and developing the use of deep learning and other data science innovations to transform data into knowledge that can be used to solve real world problems — from engineering and resource challenges, to modelling the spread of disease, to predicting and assessing mental health risks.

Professor Michael Small said it was critical to understand the processes behind these neural networks — firstly by remembering that, despite what the name suggests, AI applications are not “intelligent” entities that supplant our own critical reasoning.

“Many of these neural networks are extensions of machine learning, mathematical modelling, data fitting and statistics — techniques that we do understand,” he said.

“Where these learning algorithms become more complicated is when the amount of data going into the models is so vast — as is the volume of computation involved — that it’s not clear which part of the data is being used to make a prediction, or exactly how that prediction is being made,” Professor Small said.

That’s why one area where deep learning models have achieved the most accurate results is geoscience and resources, where much of the measurable data is rock solid. As the CSIRO-UWA Chair of Complex Systems, Professor Small has been working with CSIRO Mineral Resources in developing such algorithms.

“If you extract a geological core from the ground, you can look at that, you can analyse it, do spectrographic tests on it to quantify aspects of it,” Professor Small said. 

“The predictions based on these samples are much better understood because they’re quantifying patterns and structure in complex but concrete data sets.” 

Image: Professor Michael Small, UWA Data Institute Director.


But Professor Small and his team have also yielded positive results when the variables being factored into the equation include human beings and their behaviour.

One long-term project at Perth Clinic has been collecting data from a large population over time via patient surveys.

Early versions of an algorithm, developed from that data using deep learning, are being used at nurses’ stations to provide a helpful indicator of when a patient might be at greater risk of self-harm.

“It’s an example of the potential of these artificial neural networks to take in an enormous amount of data and identify patterns that can be used to improve — or even save — lives,” he said.

But it’s the AI tools the public are most familiar with that are arguably the most problematic when it comes to understanding how they use information to reach conclusions. Generative programs such as chatbots use data largely trawled from the internet — which is as much a source of misinformation, disinformation and bias as it is of verifiable facts.

It’s just one reason that it’s crucial to verify the results of generative tools and not to let the machine do all of the thinking.

Among many examples highlighting the risks are two recent incidents in the US and Canada, where lawyers used ChatGPT for legal research without verifying its results. In both incidents the chatbot supplied cases to cite in court as precedents — cases, it was soon discovered, that never existed.

Rather than such mishaps serving as examples of why the tools should be avoided, feared or considered “cheating”, Professor Small said they demonstrate the need to use the programs carefully and responsibly.

Addressing concerns about the widespread use of generative programs like chatbots at universities, he compares their role in higher education to the way the calculator replaced the slide rule.

“There’s a generation of educators who said these calculators were evil and cheating, and no one’s going to know how to use a proper slide rule,” Professor Small said.

“Well, I’m a professor of mathematics and I don’t know how to use a slide rule — the tool has become obsolete.”

But, as he points out, you can’t do all of mathematics with a calculator.

“In the same way, a chatbot can help you formulate sentences, find information or spark ideas — making it a great tool for getting over writer’s block — but it’s not going to write the next great novel for you — at least, not yet,” he said.

Read the full issue of the Winter 2024 edition of Uniview [Accessible PDF 12MB]
