*** VD 13 Commentary Thread ***

  • Originally posted by Kevin Seitzer View Post
    I'm using TensorFlow for the first time today to build a neural network. Yay, me! I almost feel like a real data scientist.
    I thought I posted earlier that all the techno garble wasn't going to defeat me. But keep it up!!


    OK, what does that all really mean???
    ---------------------------------------------
    Champagne for breakfast and a Sherman in my hand !
    ---------------------------------------------
    The Party told you to reject the evidence of your eyes and ears. It was their final, most essential command.
    George Orwell, 1984



    • TensorFlow is an open-source software library from Google. It can be used to build predictive models using neural networks. It's typically used for deep learning neural networks but can be used for a wide variety of machine learning and neural network tasks.

      Explaining a bit what each of those things mean:

      Predictive model - for example, one could build a model to predict the outcome of a batted ball in play, based on inputs like the exit speed, the launch angle, and the spray direction of the batted ball. One might want to know the odds of a single, double, triple, home run, or out given those launch parameters.

      Neural network - a type of predictive model structure that has no particular theoretical knowledge about the subject matter it is working with. It is fed a set of training data with the answers provided, and from that training set it learns the structure of the data on its own, iteratively constructing a network of weights between the inputs and the outputs to best match the desired output. This network of weights can then be used to make predictions on fresh data. (The technique is so named because it was supposed to mimic the way neurons are connected in the brain.)

      Deep learning neural network - a neural network that has many "hidden" layers of weights between the inputs and the outputs. Until recently, computing power wasn't sufficient to build neural networks of this scale, and it was thought that such networks would be useless anyway. Now it has been shown that huge neural networks are very good at a diverse array of tasks (image recognition, speech processing, playing games, etc.). Training a network of this size can involve a lot of complexity that wasn't present in much simpler neural networks.

      Machine learning - a way to build predictive models (of which neural networks are one type) where the computer learns on its own how to iteratively improve the model's predictive power from training data.
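
      For anyone curious what this looks like in practice, here's a minimal TensorFlow sketch of the batted-ball example above. The data is random filler and the layer sizes are made up; it's just meant to show the shape of the thing, not anyone's actual model.

```python
# Minimal sketch: a small Keras network mapping three launch parameters to
# five batted-ball outcomes. The data is random filler, just to make the
# example runnable; a real model would train on measured batted balls.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(60, 115, 1000),   # exit speed (mph)
    rng.uniform(-30, 50, 1000),   # launch angle (degrees)
    rng.uniform(-45, 45, 1000),   # spray direction (degrees)
]).astype("float32")
y = rng.integers(0, 5, 1000)      # 0=out, 1=single, 2=double, 3=triple, 4=home run

model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),  # odds of each outcome
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, verbose=0)

# Predicted odds for one hypothetical ball: 105 mph, 25 degrees, pulled slightly
print(model.predict(np.array([[105.0, 25.0, -10.0]], dtype="float32")))
```
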
      "Jesus said to them, 'Truly I tell you, the tax collectors and the prostitutes are going into the kingdom of God ahead of you.'"



      • Sorry, my afternoon got blown up. I'm headed home and will pick this evening.



        • Originally posted by Kevin Seitzer View Post
          TensorFlow is an open-source software library from Google. It can be used to build predictive models using neural networks. [...]
          I really wish I was better at using computers and databases. I really struggle to find the work interesting, so I avoid it. I think it wasn't until a few drafts ago that I learned how to use VLookup, and that was just for vintage drafting. I'm sure in the long run it would be less work for me to learn some new skills, but I always end up just grinding through it. Maybe I should try to learn something new each draft.
          ---------------------------------------------
          Champagne for breakfast and a Sherman in my hand !
          ---------------------------------------------
          The Party told you to reject the evidence of your eyes and ears. It was their final, most essential command.
          George Orwell, 1984



          • Originally posted by Kevin Seitzer View Post
            TensorFlow is an open-source software library from Google. It can be used to build predictive models using neural networks. [...]
            Oh yeah, that helps. I got it now.



            • Originally posted by The Feral Slasher View Post
              I really wish I was better at using computers and databases. [...]
              About a year ago, when I had an idea I might be in the market for a new job soon, I started forcing myself to learn R. I made myself do simple stuff in R that was easier for me to do in Excel. At first it was hard, but now I'm to the point where R is second nature and it feels strange to make a chart in Excel. I'm glad I did it, but it was definitely uncomfortable at first. I've tried to keep pushing myself to keep adding new capabilities in R. Learning TensorFlow is just the next step in that. I could do my job without learning it. But I want to keep pushing myself. I'm not at the forefront of new technology like the kids coming out of school these days are, and it's harder for me to learn new things now than it was when I was younger. But if I keep trying to learn, I find it really pays off, both in my skillset and in my attitude toward things, or whatever you want to call it.
              "Jesus said to them, 'Truly I tell you, the tax collectors and the prostitutes are going into the kingdom of God ahead of you.'"



              • I have a bunch of old decades spreadsheets still in my Dropbox from way back when. I was looking through the ones I have (VDs 14-23, with a couple missing) and checking out some old picks. Unfortunately, several of them are missing the last couple of rounds' worth of picks, so I can't tell who won. But there was one magical draft, VD 21, where I beat both the Robot and Johnny for one of only two or three titles I've ever gotten in all these gazillions of drafts.

                I picked 3rd (Big Trayn 4 Lyfe) and Johnny picked last. I remember thinking about how cruelly unfair it was that he got to see every last damn pick before he had to make his last one. Fortunately he couldn't make up the difference, and I won by 0.6 points, 191.9 to 191.3. If I recall, SeaDogStat, who went right before Johnny, essentially had the outcome riding on his second-to-last pick when he picked a certain unpicked player whose name rhymes with Shmellis Shmurks.
                More American children die by gunfire in a year than on-duty police officers and active duty military.



                • Originally posted by hacko View Post
                  Oh yeah, that helps. I got it now.
                  OK, so here's a slightly chattier version of what KS said.

                  Predictive model: I think what KS said is pretty intuitive. This is how most basic predictions have been made for most of your life. By looking at weather patterns from the past hundred years, we can guess that if there's wind coming out of the west and the temperature is dropping, snow is probably coming. If we've seen a million baseballs hit that left home plate at 100 miles an hour, at lots of different angles and in lots of different directions, we can predict what's going to happen to the next baseball to leave home plate at 100 miles an hour - maybe even if we've never seen this precise combination of angle and direction before, just because we know what balls that were very similar to it did. We might end up being wrong, but no matter what it'll be another data point and that means our next prediction is going to be even better.
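
                  (In code, that "look at what similar balls did" idea is basically a nearest-neighbors model. Here's a toy sketch on invented data, using scikit-learn just because it's the shortest way to write it:)

```python
# Toy "predict from similar past examples": classify a new batted ball by the
# k most similar balls we've already seen. All data here is invented.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
past_balls = np.column_stack([
    rng.uniform(60, 115, 500),   # exit speed (mph)
    rng.uniform(-30, 50, 500),   # launch angle (degrees)
])
# Toy label: 1 = the ball fell for a hit, 0 = it was caught
fell_for_hit = (past_balls[:, 0] + past_balls[:, 1] > 110).astype(int)

model = KNeighborsClassifier(n_neighbors=15).fit(past_balls, fell_for_hit)
# Estimated odds for a ball we may never have seen exactly: 100 mph at 28 degrees
print(model.predict_proba([[100.0, 28.0]]))
```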

                  Neural network: Imagine that you've never heard of baseball. If I gave you all of the baseball scores from last year, and I told you who won each game - something like "Braves 7, Mets 3: Braves won", you might be able to look at all of those scores and figure out that the team with the higher number always wins. You can figure that out even if nobody ever tells you the rules of baseball, just the score and the winner. This isn't super interesting of course, but it's the simplest level.
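
                  (A toy version of that in code, with invented scores: the network is never told "higher score wins"; it just sees examples and fits weights until its guesses match.)

```python
# Toy "scores in, winner out" network: it learns the higher-score-wins rule
# purely from examples, without the rule ever being written down.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(2)
scores = rng.integers(0, 15, size=(2000, 2)).astype("float32")  # (home score, away score)
home_won = (scores[:, 0] > scores[:, 1]).astype("float32")      # the answers it trains against

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability the home team won
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(scores, home_won, epochs=20, verbose=0)

# "Braves 7, Mets 3" - with enough training this should come out close to 1
print(model.predict(np.array([[7.0, 3.0]], dtype="float32")))
```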

                  Deep learning neural network: And if I gave you the result of every at-bat, in order, from all of those games, you could probably even figure out that having a baserunner reach home plate is directly related to their team's score. And if I gave you the data about how the ball moved in each at-bat, you might even figure out that the team that hits the ball in the air the most times and has the ball bounce before it's caught tends to have the most baserunners reach home plate, and therefore gets the most points, and therefore wins the game. And again, you can develop all of that knowledge without anyone ever explicitly telling you the rules of baseball. But what you're doing in this case is using multiple layers of a neural network - the first tells you that a ball hit 105 mph with a 30 degree launch angle and at a 20 degree angle from home plate usually leaves the field, and the second tells you that when that happens, the batter usually runs all the way around the bases and reaches home plate, and the third tells you that reaching home plate leads to a higher score, and that leads to winning. And from that it can figure out that Mike Trout hits the ball like that a lot more than Alcides Escobar, and therefore having Mike Trout on your team is more strongly correlated with winning than having Alcides Escobar is, and now the computer can figure out which players make winning teams without ever knowing how baseball works. It might not always be right, but it's a way to make predictions and learn things from just the raw data.

                  Another example, stolen from Wikipedia, is image recognition - if a computer wants to figure out what a picture is of, all it takes as input is each dot on the screen and what color it is. But it can figure out from those where the edges of the image are, and then it can figure out combinations of edges that make shapes, and then it can figure out that a particular combination looks like a nose or an eye or a mouth, and then it can figure out that those shapes in a particular combination make a face.
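
                  (That dots-to-edges-to-shapes-to-faces layering is the usual intuition behind convolutional networks. Here's a sketch of just such an architecture, with made-up layer sizes and no real images:)

```python
# Architecture sketch of the "dots -> edges -> shapes -> face" idea.
# Layer sizes are arbitrary; no images are loaded or trained on here.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),                 # raw input: each dot and its color
    tf.keras.layers.Conv2D(16, 3, activation="relu"),  # early layers tend to pick up edge-like patterns
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),  # middle layers, combinations of edges (shapes)
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),  # later layers, larger parts (eye, nose, mouth)
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),    # final call: face or not
])
model.summary()
```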

                  (This is what I do at my job - we have a lot of historical data that tells us that a coffee shop usually sells things for a couple dollars, and the people who buy from it usually live somewhere nearby, and they usually make sales early in the day. Not all of those things are always true, and nobody had to sit down and tell the computer how coffee shops work, but our systems can figure out that if someone who lives in Omaha buys $400 worth of coffee from a coffee shop in Moscow at midnight, this is very unusual and maybe we should prevent that charge from going through.)
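
                  (One common way to code up that kind of "this charge looks nothing like normal" check is an anomaly detector. The toy sketch below uses scikit-learn's IsolationForest on invented data; it isn't how any particular real fraud system works.)

```python
# Toy anomaly check: learn what "normal" coffee-shop charges look like from
# invented history, then flag a wildly unusual one. Not a real fraud system.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)
# Normal history: [amount in dollars, hour of day, miles from the cardholder's home]
normal_charges = np.column_stack([
    rng.normal(4, 1.5, 5000),   # a couple of dollars
    rng.normal(9, 2, 5000),     # mostly early in the day
    rng.normal(2, 1, 5000),     # usually close to home
])

detector = IsolationForest(random_state=0).fit(normal_charges)

# $400 of coffee at midnight, thousands of miles from home
weird_charge = [[400.0, 0.0, 5000.0]]
print(detector.predict(weird_charge))  # -1 means "this looks anomalous"
```
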
                  Last edited by mjl; 01-12-2019, 12:30 AM.
                  In the best of times, our days are numbered, anyway. And it would be a crime against Nature for any generation to take the world crisis so solemnly that it put off enjoying those things for which we were presumably designed in the first place, and which the gravest statesmen and the hoarsest politicians hope to make available to all men in the end: I mean the opportunity to do good work, to fall in love, to enjoy friends, to sit under trees, to read, to hit a ball and bounce the baby.



                  • Holy shit. That is really fascinating.
                    More American children die by gunfire in a year than on-duty police officers and active duty military.



                    • Originally posted by Bene Futuis View Post
                      I have a bunch of old decades spreadsheets still in my Dropbox from way back when. [...] But there was one magical draft, VD 21, where I beat both the Robot and Johnny for one of only two or three titles I've ever gotten in all these gazillions of drafts. [...]
                      I vaguely remember that draft.
                      "Jesus said to them, 'Truly I tell you, the tax collectors and the prostitutes are going into the kingdom of God ahead of you.'"



                      • Another aspect of machine learning is a "genetic algorithm", where you basically have a strategy with a bunch of possible choices in it: you try one, and if it does well (according to whatever way you decide to score it), it gets a little more likely to be reused in the future. Then you add some random mutations each time through to jumpstart new development. Here's a five-minute video that talks about a system that learned to play Super Mario pretty quickly with no understanding of how the game works beyond "the idea is to get to the right side of the screen":
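
                        Separately from the video, the core loop is small enough to sketch in toy form. This is a generic genetic algorithm with a made-up scoring function, nothing from the actual Mario system:

```python
# Bare-bones genetic algorithm: candidate "strategies" are bit strings, the
# score function is a stand-in for "how far right did you get", good strategies
# get reused as parents, and random mutations keep new ideas appearing.
import random

random.seed(0)
LENGTH, POP_SIZE, GENERATIONS, MUTATION_RATE = 30, 50, 40, 0.02

def score(strategy):
    return sum(strategy)  # toy fitness: just count the 1s

def mutate(strategy):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in strategy]

def crossover(a, b):
    cut = random.randrange(1, LENGTH)  # splice two parent strategies together
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    # Strategies that scored well are kept as parents for the next generation...
    parents = sorted(population, key=score, reverse=True)[:POP_SIZE // 2]
    # ...and their recombined, mutated offspring replace the old population.
    population = [mutate(crossover(random.choice(parents), random.choice(parents)))
                  for _ in range(POP_SIZE)]

print(max(score(s) for s in population))  # creeps toward the maximum possible score of 30
```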

                        Last edited by mjl; 01-12-2019, 01:20 AM.
                        In the best of times, our days are numbered, anyway. And it would be a crime against Nature for any generation to take the world crisis so solemnly that it put off enjoying those things for which we were presumably designed in the first place, and which the gravest statesmen and the hoarsest politicians hope to make available to all men in the end: I mean the opportunity to do good work, to fall in love, to enjoy friends, to sit under trees, to read, to hit a ball and bounce the baby.



                        • Originally posted by Bene Futuis View Post
                          Holy shit. That is really fascinating.
                          It truly is, but unfortunately with my 62-year-old brain all I get is:
                          Brain: Damn, you are really fucked at this.
                          Me: Hey, I finished 3rd once.
                          Brain: What, did everyone fall asleep at the wheel, or was it a mercy game?
                          Me: I have SOME skills.
                          Brain: Being a designer of women's clothes for the Amish is not much of a skill.
                          Me: I recruited Ken, who might help get an app/website going for this so we have more of these before I die.
                          Brain: You recruited someone else who will kick your ass. Next time think Wayne's World, dumbass.
                          Me: I am fucked!



                          • I might have accidentally deleted someone's post on mobile. Sorry. I'll fix later if so.



                            • I won't be picking super fast, just in case anyone is thinking they need to make a rushed pick. I need to put some work into the as today, and then think about my next two picks.



                              • Work into the ss (spreadsheet) **

