
Pcn For A Ducati Van?

Discussion in 'Lounge' started by J0n3s, Jul 1, 2024.

  1. What's changed in AI is the ability to process and apply huge amounts of data; that's where the LLM (Large Language Model) comes in. For decades, companies have collected enormous amounts of data but had no way of meaningfully processing it. That's now changed.

    AI can extrapolate, but only from data points it has already seen, much like projecting a rocket's trajectory forward from its track so far.
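
    A minimal sketch of that idea: fit a simple curve to a few made-up past trajectory points and extend it forward. The numbers and the quadratic model are purely illustrative.

        import numpy as np

        # Hypothetical observed positions (time in s, altitude in m) - made-up data
        t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
        h = np.array([0.0, 48.0, 92.0, 130.0, 163.0])

        # Fit a quadratic to the points seen so far
        coeffs = np.polyfit(t, h, deg=2)

        # Extrapolate: the prediction is driven entirely by the fitted past
        future_t = np.array([5.0, 6.0, 7.0])
        predicted_h = np.polyval(coeffs, future_t)
        print(dict(zip(future_t, predicted_h.round(1))))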
     
    • Agree Agree x 1
  2. About 12 years ago I had an American girlfriend who was over here studying for an MBA at the London annex of an American uni. The uni gave students access to past papers for revision/practice, so I took one online just for shits and giggles. Although the only prep I did was to read the relevant chapter of one of her textbooks, I got c.80%! The American education system is notoriously poor, but I was shocked at how easy the test was. I’d say it was about as challenging as a UK A-Level.
     
    • Like Like x 1
  3. Read an article a few weeks ago about exam cheating where students were using ChatGPT to answer questions, and the answers were very good.

    The challenge for students is that the same prompt will give the same answer, or at least a very similar one. All ChatGPT has done is take the keywords in the question and match them against millions of similar questions others have asked, and the answers they were given.
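
    A toy sketch of the keyword-matching idea described above: score a handful of stored questions by word overlap with a new prompt and return the best match's answer. The stored questions and answers are invented, and a real LLM does not actually work by lookup; this only illustrates the description given in this post.

        # Score stored questions by word overlap with the prompt (toy example)
        stored = {
            "What causes inflation?": "Inflation is a general rise in prices driven by...",
            "Explain supply and demand.": "Supply and demand describes how prices are set by...",
        }

        def best_answer(prompt: str) -> str:
            words = set(prompt.lower().split())
            overlap = {q: len(words & set(q.lower().rstrip('?.').split())) for q in stored}
            return stored[max(overlap, key=overlap.get)]

        print(best_answer("what causes high inflation"))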

    AI can look very dangerous if you're not knee-deep in the challenges it's supposed to solve (and often doesn't).
     
    • Like Like x 1
  4. I’ve never used ChatGPT and will try to avoid having anything to do with it until it becomes impossible to do so, as I don’t want that thing knowing anything about me that it hasn’t already scraped from my publicly available digital footprint.
     
    • Agree Agree x 1
  5. I'll give a more real-world example. One of my companies makes video games, which you could consider a prime example of where generative AI could be used to create the art, models, characters, scenery, etc. Many video game artists are worried. However, one of the challenges is: who owns the gen AI output, the company that makes the gen AI system or the person who writes the prompt? And that's just for starters. If I use a prompt to create a character in my game, what is stopping someone else using the same prompt and creating the same character? Who owns it then? Does that mean two people own the character, only the first, or neither?

    AI for commercial purposes is a can of worms, and many companies will steer well clear of it for a long time.
     
    • Like Like x 1
    • Agree Agree x 1
    • Useful Useful x 1
  6. For the last 10 years or so I have worked with 'old school' AI, used in an optimised network path application and as a predictor of network usage.

    In the latter case, large amounts of data are used as input to an AI function that is directed by code to process them in a particular way, producing a (model) result which can then be used in the next iteration of processing the same data, and so on for many, many iterations, i.e. tens of thousands. Each result can be checked against a desired 'accuracy' score and the process repeated to achieve better accuracy, or stopped as necessary. The winning model can then be used against any similar data set.
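
    A minimal sketch of that fit/score/repeat loop, assuming an invented data set, a simple straight-line model, and a random-search adjustment step; the names, numbers, and stopping threshold are all illustrative.

        import random

        # Made-up data that roughly follows y = 2x + 1
        data = [(x, 2.0 * x + 1.0 + random.uniform(-0.2, 0.2)) for x in range(10)]

        def score(slope: float, intercept: float) -> float:
            # Mean squared error of the current model against the data (lower is better)
            return sum((y - (slope * x + intercept)) ** 2 for x, y in data) / len(data)

        slope, intercept = 0.0, 0.0
        best = score(slope, intercept)
        for _ in range(10_000):                      # many, many iterations
            s = slope + random.uniform(-0.05, 0.05)  # small random adjustment to the model
            i = intercept + random.uniform(-0.05, 0.05)
            candidate = score(s, i)
            if candidate < best:                     # keep the adjustment only if it scores better
                slope, intercept, best = s, i, candidate
            if best < 0.05:                          # stop once the desired accuracy is reached
                break

        print(f"winning model: y = {slope:.2f}x + {intercept:.2f}, error = {best:.3f}")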

    From the outside this may look like learning, and perhaps it is how we learn, but there is no self-direction or, more importantly, insight going on. I am prepared to be corrected here, but I understand the 'new' AI is formulated to work in a similar way; because of its speed, its code can be a little 'looser' in its definition. This in turn can help a (human) user to self-direct their own learning.

    In the former case it was possible to throw a random variable into one of the processing iterations, which could be thought of as insight, but it could just as easily lead the process away from an acceptable solution as towards the optimum result. It was still the same pattern, though: process the data, check the score, use that result to process again, and so on.
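
    A hedged sketch of that random-variable idea, in the style of simulated annealing: the loop occasionally accepts a candidate that scores worse, which can shake the search out of a rut but can equally steer it away from a good solution. The objective function, step size, and temperature schedule below are all invented for illustration.

        import math
        import random

        def anneal(score, start, steps=10_000, temp=1.0, cooling=0.999):
            """Hill climb that sometimes accepts a worse candidate at random."""
            current = best = start
            for _ in range(steps):
                candidate = current + random.uniform(-0.1, 0.1)
                delta = score(candidate) - score(current)
                # Always accept improvements; accept worse moves with a probability
                # that shrinks as the 'temperature' cools - the injected randomness.
                if delta < 0 or random.random() < math.exp(-delta / temp):
                    current = candidate
                if score(current) < score(best):
                    best = current
                temp *= cooling
            return best

        # Toy objective: a poor local minimum at x = -1, a better one at x = 2
        objective = lambda x: (x + 1) ** 2 * (x - 2) ** 2 + 0.3 * (x - 2) ** 2
        print(anneal(objective, start=-1.0))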

    As I think I mentioned in another thread reply on this topic, quantum computing will be the next big step in increased computing speed, which in turn will assist the speed of AI when working on more complex problems, e.g. 'creating' music, literature, etc.
     
    • Like Like x 1
  7. I read recently that an AI was set a task using prompts in a variant of Bengali and, unexpectedly, it taught itself the entire language in order to complete it. That strikes me as potentially emergent behaviour and is extremely concerning, because emergent behaviour is unpredictable and not amenable to outside control. Even if that sort of outcome isn’t a true emergent phenomenon and is merely a sophisticated simulacrum, given the pace at which tech develops, I suspect it won’t be long before AI is exhibiting genuinely emergent behaviour.


    Another thing which worries me is that AI is learning from information on the internet and, as everyone who has been on social media knows, that information doesn’t represent the best version of humanity.
     
    • Like Like x 1
  8. Reminds me of an amusing article I read recently: a social media influencer had created an AI version of herself to allow increased interaction with her fan base. It was eventually turned off after the AI started engaging in conversations of a sexual nature, and even initiating them.
    It provides an alternative to Skynet's endgame in Terminator 2: rather than the destruction of the human species, maybe we all just end up getting hassled by some lecherous supercomputer.
     
    • Funny Funny x 3