Author Topic: Shane Wighton explains machine learning

golden_labels (Topic starter)
Shane Wighton explains machine learning
« on: May 27, 2023, 07:30:00 pm »
The shortest and easiest introduction to smortnets I have encountered so far, from Shane Wighton. The relevant part starts around 10m38s.
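To give a feel for what the video boils machine learning down to (as I read it): a model with a few adjustable knobs, an error measured against example strokes, and each knob nudged in whichever direction shrinks that error. Below is a minimal Python sketch of that loop; the two-knob model, the data, and the step sizes are all made up for illustration and do not come from the video itself.

Code:
import random

random.seed(0)

# Hypothetical training data: (where the last stroke ended, where the next one should end).
examples = [(x / 10.0, 0.7 * x / 10.0 + 0.2) for x in range(10)]

knobs = [random.random(), random.random()]  # two adjustable "knobs"

def predict(x, k):
    # The "robot": a trivially simple model of the next stroke.
    return k[0] * x + k[1]

def loss(k):
    # How far the robot's strokes are from the examples, on average.
    return sum((predict(x, k) - y) ** 2 for x, y in examples) / len(examples)

step, nudge = 0.1, 1e-4
for _ in range(2000):
    for i in range(len(knobs)):
        # See how the error changes when this knob is nudged slightly...
        trial = list(knobs)
        trial[i] += nudge
        slope = (loss(trial) - loss(knobs)) / nudge
        # ...then turn the knob a little in the direction that lowers the error.
        knobs[i] -= step * slope

print(knobs, loss(knobs))  # knobs end up near [0.7, 0.2]; the error is close to zero

After a couple of thousand nudges the knobs settle on values that reproduce the examples, which is the whole trick.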



If you want to extend this to models like ChatGPT, imagine that (a toy sketch follows this list):
  • The robot looks not at the end of the last stroke, but at the 1000 previous strokes.
  • There are 150 billion knobs.
  • The training set included not hundreds of examples, but something on a scale closer to everything humanity has ever produced.
  • It was fine-tuned to make the results look as convincing to humans as possible, not merely similar to the training set.
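To make the ChatGPT extension above a bit more concrete, here is the same "look at the previous tokens, predict the next one" idea shrunk to a toy Python example. A lookup table stands in for the 150 billion knobs, the context is 3 tokens instead of 1000, and the training text is a single invented sentence, so this is only a sketch of the shape of the problem, not of how such models work inside.

Code:
from collections import Counter, defaultdict

CONTEXT = 3  # a ChatGPT-like model looks back over thousands of tokens instead

# Invented training text; the real training data is closer to "everything humanity produced".
text = "the robot writes the word and the robot writes the letter and the robot draws it".split()

# "Training": count which token follows each run of CONTEXT tokens.
table = defaultdict(Counter)
for i in range(len(text) - CONTEXT):
    table[tuple(text[i:i + CONTEXT])][text[i + CONTEXT]] += 1

def next_token(context):
    # Predict the continuation seen most often after this context during training.
    seen = table.get(tuple(context))
    return seen.most_common(1)[0][0] if seen else "<unknown>"

print(next_token(["the", "robot", "writes"]))  # prints "the"

Swap the count table for a knob-laden network, the toy sentence for a large chunk of everything ever written, and the 3-token window for a few thousand tokens, and you have the rough recipe.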
People imagine AI as the T-1000. What we have so far is a glorified T9.
 

