Author Topic: AI Hello World  (Read 5913 times)


Online RoGeorgeTopic starter

  • Super Contributor
  • ***
  • Posts: 6202
  • Country: ro
AI Hello World
« on: August 15, 2022, 12:50:23 pm »
AI Hello World - Part 1

To install, open a terminal and type this:
Code: [Select]
sudo apt install python3 python3-pip
pip3 install torch torchvision torchaudio transformers

Try AI generated text with GPT-Neo-2.7B, a pretrained model.  The 2.7B stands for 2.7 Billion parameters.  At the first run, the code will auto-download the GPT-Neo-2.7B model, which is a 10GB download (by default downloaded in ~/.cache/huggingface/transformers/).  Once downloaded, it will all run entirely locally, no further downloads, no subscription and no Internet required.

To run it, open a terminal and type (only the text, without the  ">>> "):
Code: [Select]
python3
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B')
           
>>> text_seed = "A horse, a frog and a giraffe walk into a bar."
>>> ai_answer = generator(text_seed, max_length=128, do_sample=True, temperature=0.7)     
>>> print(); print( ai_answer[0]['generated_text'] ); print()

>>> # press arrow up twice, then Enter, to re-run the ai_answer = ... and the print() ... lines again
>>> # if answers take too long, set max_length=50; for more creative answers, try temperature=0.9

>>> exit()
That's all.  From now on it can run offline any time.
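The same thing can also be saved as a small standalone script, so it doesn't have to be retyped in the REPL each time.  A minimal sketch, using exactly the same pipeline call as above (the file name hello_gpt.py and the timing printout are just my additions):
Code: [Select]
#!/usr/bin/env python3
# hello_gpt.py - same GPT-Neo text generation as the REPL example above
import time
from transformers import pipeline

# loads (and, on the first run, downloads) the 2.7B model; this takes a while
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B')

text_seed = "A horse, a frog and a giraffe walk into a bar."

start = time.time()
ai_answer = generator(text_seed, max_length=128, do_sample=True, temperature=0.7)
print()
print(ai_answer[0]['generated_text'])
print()
print(f"generation took {time.time() - start:.0f} s")
Run it with "python3 hello_gpt.py".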
\[ \star \ \star \ \star \]
Some examples of AI-generated answers (on an i7/4GHz it takes about 1 minute to answer):

text_seed = "A horse, a frog and a giraffe walk into a bar."
Quote
A horse, a frog and a giraffe walk into a bar. The frog says, “It’s been a while since I’ve had a drink, but I’ll have one.” The giraffe says, “I’m going to have one, too.” And the bartender says, “How about a drink for the whole group?” The horse says, “Are you going to pay for this?” And the bartender says, “No, it’s on me.”
Wow, that was a disturbingly OK answer, WTF?! 8) and pardon my French.  I wonder what would happen if we tried to troll the AI.  Let's write in French.  ;D

text_seed = "La mécanisation de l'agriculture a changé la vie de nos villages."
Quote
L'économie est devenue stratosphère, la croissance économique était née de la mécanisation de l'agriculture et de l'élevage, les salaires augmentent et les coûts de production sont plus bas. [roughly: "The economy went stratospheric; economic growth was born of the mechanization of agriculture and livestock farming, wages are rising and production costs are lower."]
Lol, nice usage of the "croissant" word :o, I wonder who's trolling whom here, or, as we say in Ro, "Unde dai şi unde crapă." (meaning "where you hit vs. where it splits").  Let's try the Romanian saying with the AI, too.

text_seed = "Unde dai şi unde crapă."
Quote
Unde dai şi unde crapă. În ciuda faptului că, în ciuda unui astfel de scop, ceva întâmplător se va întâmpla, cineva va întâlni, într-un post de televiziune, un politicien care-l va acuza pe un lider politic de corupţie, şi cineva va spune, în numele lui, ... [roughly: "Where you hit and where it cracks. Despite the fact that, despite such a goal, something accidental will happen, someone will meet, on a TV channel, a politician who will accuse a political leader of corruption, and someone will say, in his name, ..."]
Meh, OK, but the mere fact that it answered in plain Romanian already felt uncanny, and unsettling.  Though, is all this of any use, other than for entertainment that gets old quickly?  It could be.  For example, it can "speak" Python, too, so GPT-Neo can be used as a programming companion:

text_seed = "def palindrome(string):     # returns the string backwards"
Quote
def palindrome(string):     # returns the string backwards in order
    for i, char in enumerate(string):
        if char!= string[-i - 1]:
            return string[:i] + string[-i - 1]
    return string

Usage:
print(palindrome("I ate a piece of pie yesterday")[::-1])

Output:
I ate a piece of pie yesterday
Note the usage example contains the pythonic way of reversing a string, "reversed = string[::-1]"  ;D
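Worth noting: if you actually run the generated snippet, it does not produce the quoted output (the function returns right after the first mismatching character), so treat such answers as a starting point rather than working code.  For what the seed comment literally asks, a hand-written reference is a one-liner; a minimal sketch:
Code: [Select]
def reverse_string(string):     # returns the string backwards
    return string[::-1]

def is_palindrome(string):      # True if the string reads the same both ways
    return string == string[::-1]

print(reverse_string("I ate a piece of pie yesterday"))
print(is_palindrome("racecar"))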

text_seed = "def bubble_sort(list):  #sort a Python list"
ai_answer = generator(text_seed, max_length=200, do_sample=True, temperature=0.9)
Quote
def bubble_sort(list):  # sort a Python list in ascending order
    for i in range(len(list)):  # for each item in the list
        if list > list[i+1]:
            list, list[i+1] = list[i+1], list

Here is a version that uses sorting.sort() and then loops over the sorted list repeatedly:
def bubble_sort(list):
    sorted_list = sorted(list)
    for i in range(len(sorted_list)):
Could be helpful.  :-+
\[ \star \ \star \ \star \]
Though, keep in mind that some answers might not be that good.  Sometimes it returns sloppy answers, or it starts repeating the same group of words over and over (increase the "temperature" if you get repetitions).
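For experimenting with those knobs, the pipeline forwards the usual transformers generation parameters; a sketch of a more constrained call (top_k, top_p and no_repeat_ngram_size are standard generation options, and the values here are only starting guesses):
Code: [Select]
# assumes 'generator' was created as in the example above
ai_answer = generator(
    text_seed,
    max_length=50,            # shorter answers, faster on a CPU
    do_sample=True,
    temperature=0.9,          # higher = more creative, less repetitive
    top_k=50,                 # sample only from the 50 most likely tokens
    top_p=0.95,               # nucleus sampling
    no_repeat_ngram_size=3,   # forbid repeating the same 3-token sequence
)
print(ai_answer[0]['generated_text'])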

Anyway, the point is:   Installing and running an AI is easier than one might guess.  :D
\[ \star \ \star \ \star \]
AI Hello World - Part 2
https://www.eevblog.com/forum/programming/ai-hello-world/msg4365268/#msg4365268
\[ \star \ \star \ \star \]
« Last Edit: August 18, 2022, 12:15:35 pm by RoGeorge »
 
The following users thanked this post: evb149

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: AI Hello World
« Reply #1 on: August 15, 2022, 02:15:54 pm »
I have been tinkering around the edges of neural networks, have implemented the MATLAB digit-recognition net, and worked through building NVIDIA's demo projects.  Thanks for posting this, I'll have to give it a try as well.
 

Online RoGeorgeTopic starter

  • Super Contributor
  • ***
  • Posts: 6202
  • Country: ro
Re: AI Hello World
« Reply #2 on: August 15, 2022, 02:34:26 pm »
Interesting, that digit-recognition project.  :-+

Such an OCR might make a universal data logger, if instead of a static picture of digits it were fed snapshots taken from a webcam.  Point the webcam at any DMM, or at a water-meter/gas-meter around the house, and you have a universal logger: good, for example, to plot how a battery discharges with a DMM attached to the load, or gas consumption vs. outside temperature, or to raise an alarm if the water-meter suddenly starts counting during the night because of a pipe break, things like that.
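Just to illustrate the idea, the logging loop itself would only be a few lines.  A rough sketch, assuming OpenCV for the webcam and some recognize_digits() function trained separately (the crop region and that function are hypothetical placeholders):
Code: [Select]
import time
import cv2                                     # pip3 install opencv-python

def recognize_digits(image):
    # placeholder: run the trained digit-recognition net on the cropped display
    return "0.000"

cam = cv2.VideoCapture(0)                      # first webcam
with open("dmm_log.csv", "a") as log:
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        display = frame[100:200, 150:400]      # crop the DMM display (adjust to your setup)
        reading = recognize_digits(display)
        log.write(f"{time.time()},{reading}\n")
        log.flush()
        time.sleep(60)                         # one sample per minute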



LATER EDIT
=========
All of the following is only nice to have, not required.  To keep it simple, none of this was posted in the OP.

Since you mentioned NVIDIA: GPT-Neo inference should run faster if you have CUDA.  The install command is slightly different, so that PyTorch knows about CUDA at install time:

Code: [Select]
# in a terminal, check CUDA type
nvcc --version

# if cuda not installed
sudo apt install nvidia-cuda-toolkit

# - visit pytorch.org to install PyTorch, it can prepare a custom install line for you, to run in a terminal
        #    - scroll to the INSTALL PYTORCH paragraph
        #    - select the desired PyTorch build, OS, etc., and it will create the proper install command
        #    - copy paste the command in a terminal to execute it and install pytorch
        #
pip3 install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu115

        # my CUDA was 11.5, a version not listed on that webpage,
        #    so I selected 11.3, then manually edited the install line above from CUDA 11.3
        #    to 11.5 (changed the 113 at the end of the line into 115)
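
Once installed, it is worth checking that PyTorch actually sees the GPU; the pipeline can then be pointed at it.  A quick sketch (the device argument of pipeline() selects GPU 0, or -1 for CPU):
Code: [Select]
import torch
from transformers import pipeline

print(torch.cuda.is_available())   # True if the CUDA build of PyTorch sees a GPU
print(torch.version.cuda)          # CUDA version PyTorch was built against (None for CPU-only builds)

# run inference on GPU 0 if available, otherwise fall back to the CPU
device = 0 if torch.cuda.is_available() else -1
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B', device=device)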


Another thing you might want to do at install time is to use a Python virtual environment (venv), to keep all the AI installs isolated from the rest of Python.  If you haven't used venv before, the idea (Python virtual environments) might seem complicated.

It all runs the same, both the engine and the AI examples; it just keeps the installs better organized, though you have to remember each time to switch to that environment before running AI examples, then deactivate the environment to go back to normal.  Something like this:
Code: [Select]
sudo apt install python3-venv

cd your/preferred/sandbox/directory/

# create a new venv, here named 'envAI'
python3 -m venv envAI

# switch to it; as long as it is active, a "(envAI)" prefix is usually visible in the command prompt
source envAI/bin/activate

# from now on, any pip3 install will install python modules inside the 'envAI' Python instead of the default Python
# check what environment is activated, the path listed by the next command should end in 'envAI/bin/python3'
which python3

# now we can start installing Python modules like PyTorch and Transformers in this isolated envAI location (AKA environment)
pip3 install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu115
pip3 install transformers

# note that the GPT-Neo model will download outside of 'envAI', the models are not pip3 installs.
# default model location is ~/.cache/huggingface/transformers, location can be changed if desired:
# https://stackoverflow.com/questions/63312859/how-to-change-huggingface-transformers-default-cache-directory

# all installed, now start a python3 and try GPT-Neo like in the python examples from the first post
python3
>>> # play with the python residing inside envAI
>>> # ...
>>> # done playing, now exit python
>>> exit()


# to run other .py files using this particular python that now has the AI modules installed
python3 path/to/any/python3_program.py

# to deactivate our envAI python (AKA the envAI environment), and return to the default python
deactivate

# all back to normal, from now on any pip3 commands
#      will act only upon the normal (default) python,
#      leaving untouched our python setup and modules that were installed inside 'envAI'

# that's it about installing PyTorch and Transformer inside our 'envAI' virtual environment (venv)
# you can make another venv, then activate it and try other versions or combinations of modules there, different from the ones inside 'envAI'
« Last Edit: August 15, 2022, 03:59:51 pm by RoGeorge »
 

Offline golden_labels

  • Super Contributor
  • ***
  • Posts: 1209
  • Country: pl
Re: AI Hello World
« Reply #3 on: August 15, 2022, 03:51:11 pm »
You may be already aware of that, but I feel obligated to say that in case you are not.

Keep in mind machine learning should not be perceived as anything more than glorified statistics, with resulting equations too complicated to be currently understood or verified. That means: what you get is probabilistic in nature and under the hood it’s just high-complexity maths.

It’s important, because people approach “smortnets” just like any other seemingly magic technology of the past and attribute properties it could never have. That leads to greatly overestimating abilities of those solutions at their current stage. On the other end there is the camp which perceives human brains as much more than they really are. ;)
People imagine AI as T1000. What we got so far is glorified T9.
 

Online RoGeorgeTopic starter

  • Super Contributor
  • ***
  • Posts: 6202
  • Country: ro
Re: AI Hello World
« Reply #4 on: August 15, 2022, 04:23:52 pm »
Agreed, and all that applies to any other neural-network driven entity too, including us, the humans.  ;D
 
The following users thanked this post: karpouzi9

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: AI Hello World
« Reply #5 on: August 15, 2022, 05:21:50 pm »
My Linux desktop doesn't have a CUDA setup although I could change out the graphics card - probably will at some point.  It's a decent machine but getting long in the tooth.  I bought an HP Omen laptop with an NVIDIA RTX3080 for the specific purpose of running my MATLAB work with a little over 6000 CUDA cores.  But that's Win 11 and I really want to do this Python stuff on Linux.  I have to work on that...

I have run across the Python environment setup and I'm still working my way through it.  Incidentally, I find PyImageSearch.com a useful site.  Of course, most of the tutorials are 'for a fee' but education really isn't free anyway.

I have a JetBot fully assembled and working.  All I need to do is take a few hundred 'good path' and 'bad path' photos in order to train the network and have it follow a route with the camera.  Coming up next!

https://www.amazon.com/waveshare-JetBot-Kit-Jetson-Nano/dp/B082FNZ96R

I view Machine Learning (particularly the 'machine' part) as the next golden opportunity for EEs (and maybe CSs).  Large data is everything!

 

Online Ed.Kloonk

  • Super Contributor
  • ***
  • Posts: 4000
  • Country: au
  • Cat video aficionado
Re: AI Hello World
« Reply #6 on: August 15, 2022, 05:29:17 pm »
You may be already aware of that, but I feel obligated to say that in case you are not.

Keep in mind machine learning should not be perceived as anything more than glorified statistics, with resulting equations too complicated to be currently understood or verified. That means: what you get is probabilistic in nature and under the hood it’s just high-complexity maths.

It’s important, because people approach “smortnets” just like any other seemingly magic technology of the past and attribute properties it could never have. That leads to greatly overestimating abilities of those solutions at their current stage. On the other end there is the camp which perceives human brains as much more than they really are. ;)

If you want a software package that really is random and indeed has a mind of its own, it's been around for 25 years: Win98.
iratus parum formica
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: AI Hello World
« Reply #7 on: August 15, 2022, 05:38:44 pm »
Keep in mind machine learning should not be perceived as anything more than glorified statistics, with resulting equations too complicated to be currently understood or verified. That means: what you get is probabilistic in nature and under the hood it’s just high-complexity maths.

Yes!  It may be hidden in the output layer but, for the digit recognition, what the CNN actually produces is the probability that the input is a particular digit.  It also produces a probability for each of the other digits.

Let's say, after training, the recognizer is 98% accurate.  That means that 2% of the time, the CNN gets the wrong answer.  It isn't necessarily the fault of the CNN, some of the example digits in the dataset are truly unrecognizable, even by humans.  Handwriting isn't an exact art.

Partial derivatives aren't all that hard but thousands of derivatives with respect to thousands of inputs get a little mind-boggling.  What about billions of inputs?

I would really like to see Fortran code for CNN blocks and, in particular, how the back propagation works.
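In the meantime, a tiny PyTorch sketch shows what the frameworks automate behind the scenes: autograd builds the whole chain of partial derivatives from one backward() call (the shapes below are picked arbitrarily, just for illustration):
Code: [Select]
import torch

# a miniature 2-layer net: 784 inputs -> 16 hidden -> 10 outputs
w1 = torch.randn(784, 16, requires_grad=True)
w2 = torch.randn(16, 10, requires_grad=True)

x = torch.randn(1, 784)                    # one fake 28x28 image, flattened
target = torch.tensor([3])                 # pretend the true digit is 3

hidden = torch.relu(x @ w1)
logits = hidden @ w2
loss = torch.nn.functional.cross_entropy(logits, target)

loss.backward()                            # backpropagation: every partial derivative in one call
print(w1.grad.shape, w2.grad.shape)        # d(loss)/d(weight), for every weight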

The problem with the MATLAB approach is that everything is canned into a simple statement.  Just add this statement and you have another layer.  I'm still working through it.

Code: [Select]



%Program to recognize digits using Deep CNN

%Giving path of dataset folder
digitDatasetPath='c:/Digits';

%Reading Digit Images from image Database Folder
digitimages=imageDatastore(digitDatasetPath,'IncludeSubfolders',true,'LabelSource','foldernames');

%Distributing images in the set of Training and Testing
numTrainFiles=750; % or use a fraction, e.g. numTrainFiles=0.75 (75%)
[TrainImages,TestImages]=splitEachLabel(digitimages,numTrainFiles,'randomize');

layers=[
imageInputLayer([28,28,1],'Name','Input')

convolution2dLayer(3,8,'Padding','same','Name','Conv_1')
batchNormalizationLayer('Name','BN_1')
reluLayer('Name','Relu_1')
maxPooling2dLayer(2,'Stride',2,'Name','MaxPool_1')

convolution2dLayer(3,16,'Padding','same','Name','Conv_2')
batchNormalizationLayer('Name','BN_2')
reluLayer('Name','Relu_2')
maxPooling2dLayer(2,'Stride',2,'Name','MaxPool_2')

convolution2dLayer(3,32,'Padding','same','Name','Conv_3')
batchNormalizationLayer('Name','BN_3')
reluLayer('Name','Relu_3')
maxPooling2dLayer(2,'Stride',2,'Name','MaxPool_3')

convolution2dLayer(3,64,'Padding','same','Name','Conv_4')
batchNormalizationLayer('Name','BN_4')
reluLayer('Name','Relu_4')

fullyConnectedLayer(10,'Name','FullyConnected')
softmaxLayer('Name','SoftMax')
classificationLayer('Name','OutputClassification')
];

lgraph = layerGraph(layers);
plot(lgraph); %Plotting Network Structure

%------ Training Options ------
options = trainingOptions('sgdm',                       ...
                          'ExecutionEnvironment','auto',...  % or AUTO, GPU or CPU
                          'InitialLearnRate',0.01,      ...
                          'MaxEpochs',4,                ...
                          'Shuffle','every-epoch',      ...
                          'ValidationData',TestImages,  ...
                          'ValidationFrequency',30,     ...
                          'Verbose',false,              ...
                          'Plots','training-progress');

net = trainNetwork(TrainImages,layers,options); %Network Training
% analyzeNetwork(net)
YPred = classify(net,TestImages); %Recognizing digits
YValidation = TestImages.Labels; %Getting labels'
accuracy = sum(YPred == YValidation)/numel(YValidation); %finding accuracy
fprintf('Accuracy: %g%%\n',100*accuracy)


I am a long way from understanding the nuances of this code.  It is MATLAB example code.  Just drop the layer components in place, set some parameters in the invocations and watch the show!

« Last Edit: August 15, 2022, 05:41:30 pm by rstofer »
 

Online RoGeorgeTopic starter

  • Super Contributor
  • ***
  • Posts: 6202
  • Country: ro
Re: AI Hello World
« Reply #8 on: August 15, 2022, 05:49:48 pm »
My Linux desktop doesn't have a CUDA setup ... I bought an HP Omen laptop with an NVIDIA RTX3080 ...

It doesn't need CUDA, that's only nice to have; it runs just fine without it.  Another option, if you want CUDA, would be to run the AI examples on the Windows machine.  GPT-Neo should work on Windows just the same, but I haven't tried it.

To install all on Windows, first search online for how to install python3 and pip3 on Windows.

Once you have that, open a command prompt (cmd) and from the Windows terminal continue with
Code: [Select]
pip3 install torch torchvision torchaudio tensorflow
that would run on CPU, but since you want CUDA, type this other line instead:
Code: [Select]
pip3 install tensorflow torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu116
Then, in the same cmd window where you installed, type "python", and you should see the ">>>" Python prompt instead of the "c:\" prompt that was there during "pip install ...".

All the Python and AI stuff is the same on Windows as it is on Linux.
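To confirm that the CUDA build was the one actually picked up, a quick check from the same python prompt should be enough (just a sketch):
Code: [Select]
python
>>> import torch
>>> torch.cuda.is_available()       # True means the cu116 build sees the GPU
>>> torch.cuda.get_device_name(0)   # should print the name of the card
>>> exit()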
« Last Edit: August 15, 2022, 05:57:52 pm by RoGeorge »
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: AI Hello World
« Reply #9 on: August 15, 2022, 06:09:38 pm »
Thanks for the Windows setup info, it is appreciated.  In general, I don't like working at the Windows command line.  I much prefer the utilities provided by Linux.

One of my interests is in CUDA computing of ordinary matrix problems, and Nvidia just happens to have a nice Fortran compiler in their SDK.  It's free, as is the C++ compiler.  I've been writing Fortran off and on since '70, so of course I am interested in Fortran code for the CNN blocks.

By setting the Auto option in the code above (Training Options -> Execution Environment), it will use the GPUs if available and the CPU if not.  For very small problems, the time difference is insignificant if not negative.  I am looking for larger examples that can run either way.  Just to see...  When a project like 'digits' only takes around 15 seconds to train, how much can you possibly save?
« Last Edit: August 15, 2022, 06:11:40 pm by rstofer »
 

Online RoGeorgeTopic starter

  • Super Contributor
  • ***
  • Posts: 6202
  • Country: ro
Re: AI Hello World
« Reply #10 on: August 15, 2022, 10:52:32 pm »
In general, I don't like working at the Windows command line.  I much prefer the utilities provided by Linux.

You may want to install Windows Subsystem for Linux (WSL), it brings Linux inside Windows, no dual boot.
https://docs.microsoft.com/en-us/windows/wsl/about
https://docs.microsoft.com/en-us/windows/wsl/install

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: AI Hello World
« Reply #11 on: August 16, 2022, 12:59:36 am »
I have used WSL off and on since it was first introduced.  I never got the feeling that the workflow was as smooth as under a full Linux install.  Things have improved since I gave up on WSL, so I need to take another look.

As I recall, getting to a graphics based text editor wasn't possible and I had to use vi or nano (no, I'm not going to get caught up in emacs).  Printing, since solved (apparently), was a big hole for those of us who want hard copy of our code.

Transferring files between Windows and WSL wasn't recommended and I suspect it still isn't.

 

Online RoGeorgeTopic starter

  • Super Contributor
  • ***
  • Posts: 6202
  • Country: ro
Re: AI Hello World
« Reply #12 on: August 18, 2022, 12:05:48 pm »
AI Hello World - Part 2

The code-generation example from Part 1 was done with a generic model, GPT-Neo-2.7B, yet the generated Python code was very promising as a programming helper (AKA a code companion).

There is another model, 'unixcoder-base', specialized in generating code in 6 programming languages:  Python, Java, JavaScript, PHP, Ruby and Go.  It is smaller (480 MB on disk) and it responds faster than the GPT-Neo-2.7B model.  It is of the BERT type (Bidirectional Encoder Representations from Transformers) trained on code, called CodeBERT, and I'm not very sure yet what all of this means, but here's a code example to play with UniXcoder by yourself:  :D
\[ \star \ \star \ \star \]
Research papers:
(no idea if these are the main papers or derivative work)

CodeBERT: A Pre-Trained Model for Programming and Natural Languages
https://arxiv.org/pdf/2002.08155.pdf

UniXcoder: Unified Cross-Modal Pre-training for Code Representation
https://arxiv.org/pdf/2203.03850v1.pdf
https://paperswithcode.com/paper/unixcoder-unified-cross-modal-pre-training/review/

Source of this demo:
https://github.com/microsoft/CodeBERT/tree/master/UniXcoder
\[ \star \ \star \ \star \]
To install:
Apart from the install needed in Part 1, download one more file, unixcoder.py (copy/paste the following line inside a terminal; note it has to be the raw file, not the GitHub HTML page):

wget https://raw.githubusercontent.com/microsoft/CodeBERT/master/UniXcoder/unixcoder.py

To run the demo:
In the same directory where the 'unixcoder.py' file was downloaded, create your own codingHelper.py, or just start 'python3' and type the following, line by line, at the ">>>" prompt inside python3:
Code: [Select]
#!/usr/bin/env python3

import torch
from unixcoder import UniXcoder

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = UniXcoder("microsoft/unixcoder-base-nine")
model.to(device)

context = """
def f(data,file_path):
    # write json data into file_path in python language
"""

tokens_ids = model.tokenize([context],max_length=512,mode="<decoder-only>")
source_ids = torch.tensor(tokens_ids).to(device)
prediction_ids = model.generate(source_ids, decoder_only=True, beam_size=3, max_length=128)
predictions = model.decode(prediction_ids)

print(); print(context+predictions[0][0]); print()


The idea is to describe, in plain English, inside the multiline text variable 'context', what the code should do, and the code will be generated for you.  At the first run, the model will be auto-downloaded (about 480 MB); from then on it all works offline, no Internet required.
\[ \star \ \star \ \star \]
A few examples of AI generated code:

context = """
def f(data,file_path):
    # write json data into file_path in python language
"""
Quote
Code: [Select]
def f(data,file_path):
    # write json data into file_path in python language
    with open(file_path, 'w') as f:
        json.dump(data, f)


context = """
// read the command line argument in C language
"""
Quote
Code: [Select]
// read the command line argument in C language
//
// @param argc number of arguments
// @param argv array of command line arguments
//
// @return 0 on success, -1 on error
//int main(int argc, char **argv)
{
    int ret = 0;

    if (argc < 2)
    {
        fprintf(stderr, "Usage: %s <command> <arguments>\n", argv[0]);
        ret = -1;
    }
    else
    {
        ret = atoi(argv[1]);
    }

    return ret;
}
(The model used in this demo is 'unixcoder-base-nine', trained on 9 programming languages, including C, instead of only 6 for 'unixcoder-base')
\[ \star \ \star \ \star \]
Such results are absolutely jaw-dropping.  :o

And the 9-language model fits in half a GB, it all runs on a desktop, and it answers in seconds, faster than it would take to type that amount of code by hand.  All this looks like some alien-grade technology from the future!  8)


Photo from:  https://www.mailplus.co.uk/edition/features/211532/revealed-after-32-years-the-top-secret-picture-one-mod-insider-calls-the-most-spectacular-ufo-photo-ever-captured  :scared:
« Last Edit: August 18, 2022, 12:25:03 pm by RoGeorge »
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: AI Hello World
« Reply #13 on: August 30, 2022, 03:37:08 pm »
So, I bought a copy of "Deep Learning with Python" by Chollet - it's a great book!

Digit recognition takes 15 lines of code and is before page 30 using Tensorflow and Keras.  The detailed discussion comes a little later but the code is pretty simple.

One thing I noticed, there is no specific input layer and no specific output layer, just two dense layers and that's it!  98% accuracy on the MNIST dataset.  The first layer has 512 outputs and the second layer just the requisite 10 outputs.

I'm using my Win 10 machine with SSH to connect to a Linux box in the back room.  Linux is just so much easier to set up and use.

From the book:

Code: [Select]
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.datasets import mnist

(train_images, train_labels),(test_images,test_labels) = mnist.load_data()

model = keras.Sequential([layers.Dense(512,activation = "relu"),
                          layers.Dense(10, activation="softmax")
                         ])

model.compile(optimizer="rmsprop",
              loss     ="sparse_categorical_crossentropy",
              metrics  =["accuracy"])

train_images = train_images.reshape((60000, 28*28))
train_images = train_images.astype("float32") / 255
test_images  = test_images.reshape((10000,28*28))
test_images  = test_images.astype ("float32") / 255

model.fit(train_images, train_labels, epochs = 5, batch_size = 128)

print("\nRunning test data\n")
test_loss, test_acc = model.evaluate(test_images, test_labels)
print("\n")

Results:

Code: [Select]
Epoch 1/5
469/469 [==============================] - 1s 3ms/step - loss: 0.2589 - accuracy: 0.9250
Epoch 2/5
469/469 [==============================] - 1s 3ms/step - loss: 0.1047 - accuracy: 0.9694
Epoch 3/5
469/469 [==============================] - 1s 3ms/step - loss: 0.0683 - accuracy: 0.9799
Epoch 4/5
469/469 [==============================] - 1s 3ms/step - loss: 0.0503 - accuracy: 0.9847
Epoch 5/5
469/469 [==============================] - 1s 3ms/step - loss: 0.0379 - accuracy: 0.9888

Running test data

313/313 [==============================] - 0s 749us/step - loss: 0.0635 - accuracy: 0.9820

98% accuracy on the test data.
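As a quick extra step, a couple more lines classify a single test digit and also make the 'probability per digit' output visible; a minimal sketch continuing from the code above:
Code: [Select]
import numpy as np

# probabilities for the first test image, one per digit 0..9
probs = model.predict(test_images[:1])[0]
print(probs.round(3))
print("predicted digit:", np.argmax(probs), " true label:", test_labels[0])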

« Last Edit: August 30, 2022, 03:56:19 pm by rstofer »
 
The following users thanked this post: RoGeorge

Offline golden_labels

  • Super Contributor
  • ***
  • Posts: 1209
  • Country: pl
Re: AI Hello World
« Reply #14 on: August 30, 2022, 10:06:32 pm »
One of my teachers in college often said: the use of evolutionary algorithms or neural networks is an indicator that someone has no idea how to solve a problem properly. This picture reminded me of that.

That was the mid-2000s, and much has changed since then thanks to the arrival of cheap computing power. It's no longer "shake the box with 50 parameters until it works". Equally, I do not find that picture outdated, akin to bringing up raw Tymoczko's argument nowadays. But having that perspective from the 2000s allows me to understand and appreciate the humor. :D

For the record, the author of this drawing is Randall Munroe, and it was originally posted as xkcd: machine learning. It has been released under CC-BY-NC 2.5; please respect the license.
« Last Edit: August 30, 2022, 10:09:54 pm by golden_labels »
People imagine AI as T1000. What we got so far is glorified T9.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: AI Hello World
« Reply #15 on: August 30, 2022, 11:46:06 pm »
I clipped it from a 3Blue1Brown video:



Time 9:45

I see it is also found at https://xkcd.com/1838/ and the license says I am free to copy and share the work as long as I don't sell it.

Rather than chase a license down a rathole, I deleted the post.
 
The following users thanked this post: RoGeorge

Offline golden_labels

  • Super Contributor
  • ***
  • Posts: 1209
  • Country: pl
Re: AI Hello World
« Reply #16 on: August 31, 2022, 03:36:55 am »
I believe there was some misunderstanding.  I never asked for removal; I just asked if you could add a note about who the author is.  There was no need to delete the post.  :'( And my reason was not legal: I asked both to let people know who the author is, and to protect the rights the author granted them.
People imagine AI as T1000. What we got so far is glorified T9.
 

Online RoGeorgeTopic starter

  • Super Contributor
  • ***
  • Posts: 6202
  • Country: ro
Re: AI Hello World
« Reply #17 on: September 05, 2022, 09:59:11 am »
... "Deep Learning with Python" by Chollet - it's a great book!

Digit recognition takes 15 lines of code

I'm using my Win 10 machine with SSH to connect to a Linux box in the back room.  Linux is just so much easier to set up and use.
...

Nice book, thanks for mentioning it.  :-+
About the setup, this can be simplified to NO setup at all.  ;D



Thanks to the free Google's Colab platform:
https://colab.research.google.com/

This short YT playlist introduces the (Jupyter Notebook like) Colab features:
https://www.youtube.com/playlist?list=PLQY2H8rRoyvyK5aEDAI3wUUqC_F0oEroL

All of it can be accessed for free, including machines with a GPU or TPU, no install required, only a web browser.  Everything runs at Google, and it is accessible through a Jupyter Notebook web page.  One needs some Google account to access the Colab machines; in my case, Google just used the YouTube account I am always logged into anyway.

Files can be downloaded locally as .ipynb or as .py, or saved on Google Drive, etc.

Of course, one could install Jupyter Lab through Conda or Anaconda and run it all locally, but Google Colaboratory is great when all you have is a tablet with WiFi and a web browser and you want to do some AI stuff.  And with a hardware GPU or TPU for free :o, that's very generous and great for learning.  Thank you, Google!



For those not familiar with Jupyter Lab (the successor of the classic Jupyter Notebook), it is just like a Python IDE, where one can type and/or run Python code through a webpage.  Pieces of code are grouped in so-called notebook "cells", which can be clicked and run in any order, or run all one after another.
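For example, a first notebook could contain just a few cells like these (a sketch; the '!' prefix runs a shell command inside a cell, and the GPU has to be enabled from Runtime -> Change runtime type):
Code: [Select]
# cell 1: check which GPU Colab assigned
!nvidia-smi

# cell 2: PyTorch comes preinstalled on Colab; transformers usually needs installing
!pip install transformers

# cell 3: the same "hello world" as in the first post, now on the free GPU
from transformers import pipeline
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B', device=0)
print(generator("A horse, a frog and a giraffe walk into a bar.",
                max_length=64, do_sample=True)[0]['generated_text'])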
« Last Edit: September 05, 2022, 10:28:50 am by RoGeorge »
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: AI Hello World
« Reply #18 on: September 05, 2022, 07:30:30 pm »
I played with Google Colab for a short time and will come back to it when training time gets excessive.

I resolved the setup issues under Win 10 and Win 11 although I think I still have some work to do to get the GPU working on Win 11.  For my initial foray into ML, I decided to blow off the virtual environment stuff.  I'll probably regret that later on.

Did I mention that Win 11 is a true PITA?  I can't imagine how I forgot to mention that.  I put a few shortcuts on the taskbar...

The good news about code from the book is that it tends to work.  No typos that I have found and the results more or less match.

Another resource:  DeepLizzard.com.  They have a LOT of very short training videos.  Mostly the courses are for pay but that's not important.  I'm working through the "Deep Learning Dictionary" (69 videos of around 3 minutes each) and the explanations of the concepts are quite good.  The authors are wandering around south-east Asia (or they were when some of the videos were recorded) and they have covered a lot of ground: Thailand, Malaysia, Singapore (my favorite), Bali, Indonesia, etc.  The videos are recorded in the hotel room with a laptop.

There are a number of free videos to get a feel for the content - it's VERY good!

What's with Thailand?  Jonas from VHDLWhiz.com is from Norway and living in Thailand.  When I was in Phuket many years ago, a motel room on the beach was $6/night in USD.  I suspect it has gone up a lot since the tsunami wiped out most of the beach area.  Urban renewal...

Ice was worth more than lobster...

 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3915
  • Country: gb
Re: AI Hello World
« Reply #19 on: September 07, 2022, 05:46:09 am »
One of my teachers in college often said: the use of evolutionary algorithms or neural networks is an indicator that someone has no idea how to solve a problem properly.

lol  :-DD

well, applied to the universe, even black holes can be seen as an evolutionary algorithm applied to the laws of physics.  They make new universes, with new laws of physics, where, who knows, intelligent life may emerge from silicon.

Evolutionary algorithms applied to the biological neural networks of primates can demonstrate that sooner or later a monkey will come down from its tree, and that's how a monkey becomes "sapiens".

Evolution by DNA mutations: 4% of difference between a monkey and a sapiens, over thousands and thousands of millions of iterations.

Probably Einstein was wrong.  "God" must be an A.I. that plays with the dice because it has no idea how to engineer things properly  :-//
« Last Edit: September 07, 2022, 05:50:47 am by DiTBho »
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14472
  • Country: fr
Re: AI Hello World
« Reply #20 on: September 07, 2022, 06:26:59 pm »
While I got relatively interested in "AI" when I was at uni (in the mid-90s), as the topic in general has a lot of interesting approaches and applications, I have almost always failed to find any interest in neural networks. And they still do not interest me one bit, for the most part.

They are the epitome of using extravagant amounts of memory, processing power and data storage to solve problems while having no real clue how those get solved in the process.

But that approach seems popular lately, and not just with "AI", so. I guess we better get used to it, or something. :-DD
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: AI Hello World
« Reply #21 on: September 07, 2022, 07:54:09 pm »
While I got relatively interested in "AI" when I was at uni (in the mid-90s), as the topic in general has a lot of interesting approaches and applications, I have almost always failed to find any interest in neural networks. And they still do not interest me one bit, for the most part.

They are the epitome of using extravagant amounts of memory, processing power and data storage to solve problems while having no real clue how those get solved in the process.

But that approach seems popular lately, and not just with "AI", so. I guess we better get used to it, or something. :-DD

They are indeed the latest silver bullet, with all that implies.

There are a number of key problems, including:
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3915
  • Country: gb
Re: AI Hello World
« Reply #22 on: September 07, 2022, 09:28:05 pm »
I wrote an AI-NLP module to automatically identify and ban trolls from the letterbox of my website.

The best application ever, because it saves my days from nasty people :D
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: AI Hello World
« Reply #23 on: September 08, 2022, 02:31:16 pm »
I wrote an AI-NLP module to automatically identify and ban trolls from the letterbox of my website.

The best application ever, because it saves my days from nasty people :D

Are you planning to distribute it via github or something?
I don't have any particular use for such a tool but I'm sure it would be educational.
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3915
  • Country: gb
Re: AI Hello World
« Reply #24 on: September 08, 2022, 04:25:39 pm »
Are you planning to distribute it via github or something?
I don't have any particular use for such a tool but I'm sure it would be educational.

Maybe in the future; currently I don't want people analyzing its code to exploit its vulnerabilities, because that would nullify my effort.  What I want is to keep the letterbox space clean of nasty messages, because I literally got bombarded  :o :o :o

The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

