SimplML 4+
supervised numeric learning
differential enterprises
Description
All-numeric supervised machine learning ("tabular regression," in Apple CoreML parlance) via basis function methods, with model slicer visualization tools.
Now supporting binary classification problems.
See What's New In This Version to learn about all the latest updates!
This is a "no code" baseline app to serve as a test bed for basis function approximation methods, which are fairly simple (comparatively speaking) machine learning methods involving a single fully connected inner layer, and are usable for solving general smooth-ish function fits. There is no need to install a myriad of Python packages or other tools to use this. Simply download the app from the app store and read in your CSV file.
The legendary automotive miles-per-gallon data set, which pre-dates TensorFlow (and many of its users) and originates from one of the infamous automotive "malaise eras," is provided here in the proper CSV format as a tutorial data set for learning the system:
https://diffent.com/mpgfull.csv
Original data source & background:
https://www.tensorflow.org/tutorials/keras/regression
More detailed use instructions at https://diffent.com/simpleml.pdf
Features:
two basis function types (Gaussian & multiquadric)
two distance metrics (Euclidean and Manhattan); both are sketched in code after this feature list
adjustable shape parameter
flexible on-screen data column selector
constant or variable basis function widths
flagging of some improper data (question marks or other non-numeric characters in the data); scientific notation is supported, e.g. 4.2e1
optional input variable normalization
model complexity hyperparameter adjustments (manual)
auto withhold of points for train/test split (user controllable amount via fraction of original points)
actual versus predicted plots for train, test, and out of sample evaluation points
actual versus predicted data exported to annotated columnar text files
plots in portable HTML for easy use in other systems
hyperparameters stored as user defaults
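For reference, here is a sketch of the two basis function types and two distance metrics listed above, written in their common textbook forms; the app's exact parameterization of the shape parameter may differ:

#include <stdio.h>
#include <math.h>

/* Euclidean (L2) and Manhattan (L1) distances between two n-dimensional points */
static double dist_euclidean(const double *a, const double *b, int n) {
    double s = 0.0;
    for (int i = 0; i < n; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
    return sqrt(s);
}
static double dist_manhattan(const double *a, const double *b, int n) {
    double s = 0.0;
    for (int i = 0; i < n; i++) s += fabs(a[i] - b[i]);
    return s;
}

/* basis functions of the distance r, each with an adjustable shape parameter */
static double basis_gauss(double r, double shape) {
    return exp(-(shape * r) * (shape * r));         /* localized bump around the center */
}
static double basis_multiquadric(double r, double shape) {
    return sqrt(1.0 + (shape * r) * (shape * r));   /* grows slowly away from the center */
}

int main(void) {
    double p[2] = {0.0, 0.0}, q[2] = {3.0, 4.0};
    double r2 = dist_euclidean(p, q, 2);   /* 5 */
    double r1 = dist_manhattan(p, q, 2);   /* 7 */
    printf("gauss = %.4f  multiquadric = %.4f\n",
           basis_gauss(r2, 0.5), basis_multiquadric(r1, 0.5));
    return 0;
}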
Variable importance is estimated by leaving a variable out of an already solved model and comparing the RMS error on the training data set to the full-variable RMS error. If the RMS error doesn't change much, the variable was not very important. Results are reported as (RMSchange/RMS - 1) per variable in the lower green output window.
You can use this information to remove less important variables from a model, for improved out of sample performance.
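A minimal sketch of that importance score, assuming RMSchange is the training RMS error with the variable left out and RMS is the full-variable training RMS error (the function and variable names are illustrative, not SimplML internals):

#include <stdio.h>
#include <math.h>

/* RMS error between actual and predicted values */
static double rms_error(const double *actual, const double *pred, int n) {
    double s = 0.0;
    for (int i = 0; i < n; i++) {
        double d = actual[i] - pred[i];
        s += d * d;
    }
    return sqrt(s / n);
}

/* importance score: relative change in training RMS when one variable is left out */
static double importance(double rmsWithoutVar, double rmsFull) {
    return rmsWithoutVar / rmsFull - 1.0;   /* near 0 => the variable mattered little */
}

int main(void) {
    /* toy numbers: predictions from the full model vs. from the model with one
       variable left out, compared against the same actual training values */
    double actual[4]       = {1.0, 2.0, 3.0, 4.0};
    double predFull[4]     = {1.1, 1.9, 3.2, 3.8};
    double predLeaveOut[4] = {1.4, 1.6, 3.5, 3.4};

    double rmsFull   = rms_error(actual, predFull, 4);
    double rmsChange = rms_error(actual, predLeaveOut, 4);
    printf("importance = %.2f\n", importance(rmsChange, rmsFull));
    return 0;
}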
What’s New
Version 5.9
Auto-weight unbalanced binary output values in the residual / sum-of-squared-error computation (for classification problems) based on the actual data imbalance ratio. For example, if you have 25 outputs at 1 and 75 outputs at 0 in a 100-point data set, the imbalance ratio is 3 to 1, so the errors on the actual 1 values are weighted as 3 times more important, so that the solver focuses on them. This factor may be high, but we leave it there for now.
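A sketch of that weighting scheme, under the assumption that the imbalance factor multiplies the squared residuals of the actual-1 rows (variable names are illustrative, not SimplML source):

#include <stdio.h>

/* imbalance-weighted sum of squared errors: residuals on rows whose actual value
   is 1 are weighted by the imbalance ratio (count of 0s divided by count of 1s) */
static double weighted_sse(const double *actual, const double *pred, int n) {
    int ones = 0, zeros = 0;
    for (int i = 0; i < n; i++) {
        if (actual[i] == 1.0) ones++; else zeros++;
    }
    double ratio = (ones > 0) ? (double)zeros / (double)ones : 1.0;  /* e.g. 75/25 = 3 */

    double sse = 0.0;
    for (int i = 0; i < n; i++) {
        double r = actual[i] - pred[i];
        double w = (actual[i] == 1.0) ? ratio : 1.0;
        sse += w * r * r;   /* assumption: the factor multiplies the squared residual */
    }
    return sse;
}

int main(void) {
    /* 1 positive and 3 negatives -> imbalance ratio 3, so the error on the
       single actual-1 row counts three times as much */
    double actual[4] = {1.0, 0.0, 0.0, 0.0};
    double pred[4]   = {0.6, 0.1, 0.2, 0.0};
    printf("weighted SSE = %.3f\n", weighted_sse(actual, pred, 4));
    return 0;
}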
Allow the "max down count" that can occur before stopping is triggered during a fwd stepwise solve to be 10 (was 2).
Output the so-called precision of a binary outcome solve for the 1 target value during a stepwise solve. The right-hand-side variables below are integer counts.
/* precision for the "1" class: correct 1 predictions divided by all rows predicted as 1 */
float mlPrecision = (float)correctBinary1/((float)(correctBinary1 + (actualBinary0 - correctBinary0)));
Is this the only possible definition of "precision"? Doubtful. This is why we output the raw ratios (confusion matrix) for further analysis.
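For reference, here is a sketch relating those counts to the usual confusion-matrix quantities; actualBinary1 is an assumed name for the total count of actual-1 rows, and the precision/recall formulas are the standard textbook ones:

#include <stdio.h>

int main(void) {
    /* example counts from a hypothetical stepwise solve */
    int correctBinary1 = 20;   /* true positives: actual 1, predicted 1 */
    int actualBinary1  = 25;   /* all rows whose actual value is 1 (assumed name) */
    int correctBinary0 = 70;   /* true negatives: actual 0, predicted 0 */
    int actualBinary0  = 75;   /* all rows whose actual value is 0 */

    int falsePositives = actualBinary0 - correctBinary0;  /* actual 0s predicted as 1 */
    int falseNegatives = actualBinary1 - correctBinary1;  /* actual 1s predicted as 0 */

    float precision = (float)correctBinary1 / (float)(correctBinary1 + falsePositives);
    float recall    = (float)correctBinary1 / (float)(correctBinary1 + falseNegatives);
    printf("precision = %.3f  recall = %.3f\n", precision, recall);
    return 0;
}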
App Privacy
The developer, differential enterprises, indicated that the app’s privacy practices may include handling of data as described below. For more information, see the developer’s privacy policy.
Data Not Collected
The developer does not collect any data from this app.
Privacy practices may vary, for example, based on the features you use or your age.
Information
- Seller
- differential enterprises
- Size
- 658.9 KB
- Category
- Utilities
- Compatibility
- Mac: Requires macOS 12.0 or later.
- Languages
- English
- Age Rating
- 4+
- Copyright
- © 2023 diffent.com
- Price
- Free