AI technology is quickly getting out of hand


  • AI technology is quickly getting out of hand

    First the deepfake Taylor Swift porn and now this:

    Finance worker pays out $25 million after video call with deepfake ‘chief financial officer’.

    A finance worker at a multinational firm was tricked into paying out $25 million to fraudsters using deepfake technology to pose as the company’s chief financial officer in a video conference call, according to Hong Kong police.


    “Two things are infinite: the universe and human stupidity; and I'm not sure about the universe.”

    ― Albert Einstein

  • #2
    This, I'm afraid, is part of the "Brave New World". Kind of like nuclear power. You can use it to satisfy the world's energy needs, or you can use it to blow things up.

    • #3



      https://twitter.com/sambhavgupta6/st...54106345439477

      Interesting videos generated from text inputs, if I understand correctly.

      Every single video in this thread was generated by OpenAI's new text-to-video model.
      ---------------------------------------------
      Champagne for breakfast and a Sherman in my hand !
      ---------------------------------------------
      The Party told you to reject the evidence of your eyes and ears. It was their final, most essential command.
      George Orwell, 1984

      • #4
        Originally posted by The Feral Slasher View Post
        https://twitter.com/sambhavgupta6/st...54106345439477


        Interesting videos generated from text inputs, if I understand correctly.

        Every single video in this thread was generated by OpenAI's new text-to-video model.
        That big yellow face may also have been generated by Sora, and it won't let me delete it... the machines are winning.
        ---------------------------------------------
        Champagne for breakfast and a Sherman in my hand !
        ---------------------------------------------
        The Party told you to reject the evidence of your eyes and ears. It was their final, most essential command.
        George Orwell, 1984

        • #5
          A 61-year-old man was arrested, then beaten and raped in jail, because AI facial recognition mistakenly ID'ed him as a sunglasses thief:



          Another disastrous, soul-chilling potential consequence of the Brave New World we're heading into.

          • #6
            Not as chilling as my last post, but here is another problem involving the use of AI:



            I know nothing about Grammarly, but it sounds like this student was just using it as a spell-checker or punctuation-checker. It is turning out that there are so many ways, most of which people have little to no idea about, in which AI can be and is being abused or misused to adversely affect innocent people.

            • #7
              Originally posted by rhd View Post
              Not as chilling as my last post, but here is another problem involving the use of AI:



              I know nothing about Grammarly, but it sounds like this student was just using it as a spell-checker or punctuation-checker. It is turning out that there are so many ways, most of which people have little to no idea about, in which AI can be and is being abused or misused to adversely affect innocent people.
              Grammarly is aggressively marketing to my university. They used to be mostly an editing-help program, but to compete, they have added ChatGPT-style writing features. Their primary consumer is the end user, the student, which is why they have added features that allow students to have the program write for them. When I pressed them on this, as the person they were marketing to, their answer was that students are told in the Terms of Service not to use the tech in ways that are forbidden by their teachers, but there are no actual guardrails.

              Such programs are typically explicitly forbidden these days by those who think their usage undermines the learning intended for students, but it is pretty much impossible to police their usage. Turnitin is laughably inept and economically motivated to undercut competitors, while its parent company has its own tools students can use that don't show up in Turnitin. Our lawyers have told us that there is no legal leg to stand on to use Turnitin as "proof" against a student, a position that took them longer to come to than it should have--Turnitin has always been a flawed piece of software and an unnecessary, harmful gatekeeper integrated into many Learning Management Systems at universities. This student's grade appeal will overturn the failing grade, as it should, while other students will keep using generative AI to write their papers, as they should not, without penalty, because these detection shortcuts are flawed. It is a Brave New World.

              • #8
                There will probably need to be a conversation about just allowing AI to help write students' papers. Once this technology is embraced in the real world, schools are doing students no service by stopping them from using what they will likely be using in their professional careers. Teachers will have to determine just how much to allow and how they can adapt their lessons to this new tech.

                • #9
                  Originally posted by ironfist View Post
                  There will probably need to be a conversation about just allowing AI to help write students' papers. Once this technology is embraced in the real world, schools are doing students no service by stopping them from using what they will likely be using in their professional careers. Teachers will have to determine just how much to allow and how they can adapt their lessons to this new tech.
                  This is the ongoing conversation at the leadership level. I was at a state admin conference here in NC earlier this month, and the AI session was standing room only. I spoke up to say that, as someone in the business world, I applaud people on my team who use tools to do their jobs better, but that a tool is only as good as what is put into it. It isn't going to turn a shitty writer into Hemingway, but the fear of the unknown is driving a lot of policy right now.

                  • #10
                    Originally posted by Moonlight J View Post

                    This is the ongoing conversation at the leadership level. I was at a state admin conference here in NC earlier this month, and the AI session was standing room only. I spoke up to say that, as someone in the business world, I applaud people on my team who use tools to do their jobs better, but that a tool is only as good as what is put into it. It isn't going to turn a shitty writer into Hemingway, but the fear of the unknown is driving a lot of policy right now.
                    Yeah, we've seen what a disaster it has been for some companies that tried to use ChatGPT or other AI without really combing over its output. Some really ridiculous results. But it's the future, and we need to learn how to use it in the best way possible and to know what pitfalls to look out for.

                    • #11
                      Originally posted by ironfist View Post

                      Yeah, we've seen what a disaster it has been for some companies that tried to use ChatGPT or other AI without really combing over its output. Some really ridiculous results. But it's the future, and we need to learn how to use it in the best way possible and to know what pitfalls to look out for.
                      I don't disagree that there is a place for that, but I teach writing, mostly to undergrads. The fundamental question, for us, is whether we want students to produce good writing or to become good writers. While I want the latter, I think that for now, even if you just want good writing, we need to teach the fundamentals and have students practice them to get a handle on them before they rely on AI. As John Ruskin said, and others have echoed, "The highest reward for a person's toil is not what they get for it, but what they become by it." Our learning objectives include helping students become critical thinkers, effective and discerning researchers, and effective communicators.

                      Generative AI is such a powerful tool, even now, that it can be used in ways that undermine those learning objectives. So, while I see a place in education for teaching students how to use such tools, I think that has to come after the fundamentals are learned, and even then, generative AI is a shortcut that can undermine real growth in the skills I've devoted my life to teaching. I appreciate that for those with other professional or educational goals, that is less of an issue, but in a writing class, I want students to write, from soup to nuts, without relying on such tools, although I do see a need for increased awareness and exposure so students know what generative AI currently does well and what it does poorly, and the reasoning for why we ask them to hold off on using such programs in our classes specifically.

                      There is a faculty member in my program who follows these guidelines in our first-year writing class but is developing an upper-level class focused on how to use these tools effectively. I'm looking forward to seeing how that class develops--I am sure it will be popular--but I'm not currently in favor of such tools in our first-year writing class. Most of our students don't know the difference between a credible and a non-credible source or how to develop a good research question. Having them use a tool like ChatGPT at this stage just leads them to try to take a shortcut past learning these things, to their detriment.

                      As a quick analogy, to me, having students use generative AI to write would be like an art class having students use generative AI to produce images. Sure, it can be done, and in some contexts that is all that matters, but in an educational context, the learning of the skills matters. Learning how to use AI to write is not learning to write. Maybe for most people we are headed to a point where learning to write well is not an essential skill, but I'd like to think it has value, just as painting realistic images still has value in a world where photography is quick and easy.
                      Last edited by Sour Masher; 02-22-2024, 01:01 PM.

                      • #12
                        I look at it more like using a calculator in math class. If you don't know the underlying principles, it's not helpful. You wouldn't want elementary school students using calculators when you're teaching basic math, but when you get to higher-level math, it's unrealistic to expect students to do it by hand. And they certainly won't be doing it by hand in the professional world.

                        I certainly would not advocate letting students use it in a writing class. That's counterproductive, unless a portion of the class is devoted to writing with AI. You need to know how to write in order to use AI properly. Taking what AI spits out and publishing it is a recipe for disaster. A good foundation in writing is necessary to take the AI output and polish it into a useful end product.

                        • #13
                          OpenAI's video from text prompts:

                          Sora (openai.com)
                          I'm not expecting to grow flowers in the desert...
