Data trends

Tim Keeping
Associate Editor for geophysical data management and analysis
technical-standards@aseg.org.au

Regression, easy to do but difficult to interpret

Figure 1. True vs predicted graph after running the Matlab Regression Learner module with the Fine Tree option.

It used to be a relatively simple request. Gary wants to investigate linear regression for radiometric data to try out Brian Minty's method for identifying naturally occurring high gamma-emitting rocks from other sources. A simple linear regression model based on three variables, each with 161 000 data points, will provide a reasonable prediction model that, when subtracted from Total Count, (hopefully) leaves the unexplained anomalies. Similar to removing the regional in magnetics.

Figure 2. Decision tree for three variables produced by the Matlab Regression Learner module. This is what you might get instead of equation coefficients in the future.

There are plenty of options at hand, such as R, Excel's long-time add-in Analysis ToolPak, and the inbuilt functions of many languages. Every option gave roughly the same coefficients below to make graphs similar to Figure 1.

6.24K + 7.36U + 7.38Th = TC

However, this was a chance to explore machine learning, because machine learning is designed to solve exactly this kind of problem. Out of 12 general options I followed the default choice of Trees and tried both Coarse and Fine Trees. Instead of the simple equation that impresses pundits on a conference poster, I received a trained model data object larger than my input data.

The trained model data object contains a trained regression model (Figure 2) and the original data to help calculate predictions on new data. The regression model itself is a binary tree, similar to the b-tree search algorithm we learnt from Dr Dobb's Journal in C programming: a giant if statement that adjusts itself (balances) so searches start in the middle of the tree for the fastest average search time each run.

Machine learning calls them decision trees, and the nodes are determined by grouping data into clusters by proximity, resulting in the 41 961 decision nodes that dictate which cluster future data will be assigned to. It sounds more thorough than a general three-coefficient equation, and it may well be, but according to my O'Reilly Hands-On Machine Learning (Géron, 2019), by letting machines build the decisions we give up the ability to pinpoint why a decision was made, and adding more variables creates more, not less, ambiguity. Coarse Trees produced 6353 decision nodes and Fine Trees 41 961.

Gary gets three coefficients he can use but a model he cannot. How do I exchange machine learning models? How can I explain my thinking when I did not think it? Will trained models be the required "full waveform" for any data set touched by AI in the future? For now we will discuss whether to stick with easily reproducible equations or take the not-so-exchangeable, not-so-reproducible road of machine learning.

Reference

Géron, A., 2019. Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems (2nd ed.). O'Reilly.
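For readers who want to try the comparison outside Matlab, below is a minimal sketch in Python with scikit-learn, the library covered in the Géron (2019) reference, rather than the author's Regression Learner workflow. The data are synthetic, generated from the column's three coefficients plus noise, and the min_samples_leaf values standing in for the Coarse and Fine Tree presets are assumptions, not the Regression Learner defaults.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the 161 000 radiometric samples (columns: K, U, Th).
n = 161_000
X = rng.uniform(0.0, 5.0, size=(n, 3))
tc = X @ np.array([6.24, 7.36, 7.38]) + rng.normal(0.0, 2.0, size=n)  # Total Count plus noise

# 1. Ordinary linear regression: three coefficients anyone can quote and re-apply.
lin = LinearRegression().fit(X, tc)
print("Linear coefficients (K, U, Th):", np.round(lin.coef_, 2))
residuals = tc - lin.predict(X)  # the "unexplained" anomalies left after subtraction

# 2. Decision-tree regression, loosely mirroring the Coarse and Fine Tree options.
coarse = DecisionTreeRegressor(min_samples_leaf=36, random_state=0).fit(X, tc)
fine = DecisionTreeRegressor(min_samples_leaf=4, random_state=0).fit(X, tc)
print("Coarse tree nodes:", coarse.tree_.node_count)
print("Fine tree nodes:  ", fine.tree_.node_count)

Even on made-up data the contrast holds: the linear fit hands back three coefficients that can be published and re-applied anywhere, while the trees hand back thousands of nodes that only the saved model object can reproduce.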