
Thread: *** VD 13 Commentary Thread ***

  1. #361
    Big Leaguer The Feral Slasher's Avatar
    Join Date
    Oct 2011
    Location
    Seattle
    Posts
    2,752
    Quote Originally Posted by Kevin Seitzer View Post
    I'm using TensorFlow for the first time today to build a neural network. Yay, me! I almost feel like a real data scientist.
    I thought I posted earlier that all the techno garble wasn't going to defeat me. But keep it up!!


    OK, what does all that really mean???

  2. #362
    Big Leaguer Kevin Seitzer's Avatar
    Join Date
    Jan 2011
    Location
    Houston, TX
    Posts
    3,150
    TensorFlow is an open-source software library from Google. It can be used to build predictive models using neural networks. It's typically used for deep learning neural networks but can be used for a wide variety of machine learning and neural network tasks.

    Explaining a bit what each of those things mean:

    Predictive model - for example, one could build a model to predict the outcome of a batted ball in play, based on inputs like the exit speed, the launch angle, and the spray direction of the batted ball. One might want to know the odds of a single, double, triple, home run, or out given those launch parameters.

    Neural network - a type of predictive model structure that has no particular theoretical knowledge about the subject matter it is working with. It is fed a set of training data with the answers provided, and from that training set it learns on its own the structure of the data, iteratively constructing a network of weights between the inputs and outputs to best match the desired output. This network of weights can then be used to make predictions on fresh data. (This technique is so named because it was supposed to mimic the way neurons are connected in the brain.)

    Deep learning neural network - a neural network that has many "hidden" layers of weights between the inputs and the outputs. Until recently, computing power wasn't sufficient to build neural networks of this scale, and it was thought that such neural networks would be useless, anyway. Now, it has been shown that huge neural networks are very good at a diverse array of tasks (image recognition, speech processing, playing games, etc.). Training a network of this size can involve a lot of complexity that wasn't present in much simpler neural networks.

    Machine learning - a way to build predictive models, of which neural networks are one type, where the computer learns on its own how to iteratively improve the predictive power of the model from training data.
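
    You don't need TensorFlow to see the core idea behind the "iteratively constructs a network of weights" part. Here's a from-scratch sketch in plain Python: a single "neuron" (logistic unit) trained by gradient descent to separate hits from outs. All the batted-ball numbers here are invented purely for illustration, and real models would use far more data and features.

```python
import math

# Toy training data (numbers invented for illustration):
# (exit speed mph, launch angle deg, label: 1 = hit, 0 = out)
data = [
    (70, 10, 0), (75, 5, 0), (98, 12, 0),
    (100, 25, 1), (105, 30, 1), (102, 28, 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A single "neuron": two weights and a bias, all starting at zero.
w = [0.0, 0.0]
b = 0.0
lr = 1.0  # learning rate

# Iteratively nudge the weights to better match the training labels.
for epoch in range(5000):
    for speed, angle, label in data:
        s, a = speed / 100.0, angle / 100.0  # crude feature scaling
        p = sigmoid(w[0] * s + w[1] * a + b)
        err = p - label  # gradient of the log loss w.r.t. the output
        w[0] -= lr * err * s
        w[1] -= lr * err * a
        b -= lr * err

def prob_hit(speed, angle):
    """Probability the trained neuron assigns to 'hit' for fresh data."""
    return sigmoid(w[0] * speed / 100.0 + w[1] * angle / 100.0 + b)
```

    A deep network is, very loosely, many layers of units like this stacked up, with the training step pushing corrections backward through all the layers instead of just one.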
    "There was nothing for him to do under the truck, but it's tough to blame him now that he is dead." -V.Erps 3/26/2005

  3. #363
    Big Leaguer TS Garp's Avatar
    Join Date
    Jan 2011
    Location
    San Diego, CA
    Posts
    1,276
    Sorry, my afternoon got blown up. I'm headed home and will pick this evening.

  4. #364
    Big Leaguer The Feral Slasher's Avatar
    Join Date
    Oct 2011
    Location
    Seattle
    Posts
    2,752
    Quote Originally Posted by Kevin Seitzer View Post
    TensorFlow is an open-source software library from Google. It can be used to build predictive models using neural networks. [...]
    I really wish I was better at using computers and databases. I really struggle to find the work interesting, so I avoid it. I think it wasn't until a few drafts ago that I learned how to use VLOOKUP, and that was just for vintage drafting. I'm sure in the long run it would be less work for me to learn some new skills, but I always end up just grinding through it. Maybe I should try to learn something new each draft.

  5. #365
    Big Leaguer hacko's Avatar
    Join Date
    Jan 2011
    Location
    Wadsworth , OHIO
    Posts
    1,798
    Quote Originally Posted by Kevin Seitzer View Post
    TensorFlow is an open-source software library from Google. It can be used to build predictive models using neural networks. [...]
    Oh yeah, that helps. I got it now.

  6. #366
    Big Leaguer Kevin Seitzer's Avatar
    Join Date
    Jan 2011
    Location
    Houston, TX
    Posts
    3,150
    Quote Originally Posted by The Feral Slasher View Post
    I really wish I was better at using computers and databases. I really struggle to find the work interesting, so I avoid it. I think it wasn't until a few drafts ago that I learned how to use VLOOKUP, and that was just for vintage drafting. I'm sure in the long run it would be less work for me to learn some new skills, but I always end up just grinding through it. Maybe I should try to learn something new each draft.
    About a year ago, when I had an idea I might be in the market for a new job soon, I started forcing myself to learn R. I made myself do simple stuff in R that would have been easier for me to do in Excel. At first it was hard, but now I'm at the point where R is second nature and it feels strange to make a chart in Excel. I'm glad I did it, but it was definitely uncomfortable at first. I've tried to keep pushing myself to add new capabilities in R, and learning TensorFlow is just the next step in that. I could do my job without learning it, but I want to keep pushing myself. I'm not on the forefront of new technology like the kids coming out of school these days are, and it's harder for me to learn new things now than it was when I was younger. But if I keep trying to learn, I find it really pays off, both in my skillset and in my attitude toward things, or whatever you want to call it.
    "There was nothing for him to do under the truck, but it's tough to blame him now that he is dead." -V.Erps 3/26/2005

  7. #367
    I have a bunch of old decades spreadsheets still in my Dropbox from way back when. I was looking through the ones I have (VDs 14-23 with a couple missing) and checking out some old picks. Unfortunately, several of them are missing the last couple rounds worth of picks so I can't tell who won. But there was one magical draft, VD 21, where I beat both the Robot and Johnny for one of only two or three titles I've ever gotten in all these gazillions of drafts.

    I picked 3rd (Big Trayn 4 Lyfe) and Johnny picked last. I remember thinking about how cruelly unfair it was that he got to see every last damn pick before he had to make his last. Fortunately he couldn't make up the difference and I won by 0.6 points, 191.9 to 191.3. If I recall, SeaDogStat, who went right before Johnny, had essentially the outcome riding on his second to last pick when he picked a certain unpicked player whose name rhymes with Shmellis Shmurks.
    New Rule: anyone that was cool with the GOP inventing $2 trillion out of thin air for freebies for people with yachts that have tiny yachts inside doesn’t get to demand how we pay for people who need chemotherapy treatments. --- Alexandria Ocasio-Cortez

  8. #368
    Big Leaguer
    Join Date
    Jan 2011
    Location
    San Francisco
    Posts
    3,599
    Quote Originally Posted by hacko View Post
    Oh yeah, that helps. I got it now.
    OK, so here's a slightly chattier version of what KS said.

    Predictive model: I think what KS said is pretty intuitive. This is how most basic predictions have been made for most of your life. By looking at weather patterns from the past hundred years, we can guess that if there's wind coming out of the west and the temperature is dropping, snow is probably coming. If we've seen a million baseballs hit that left home plate at 100 miles an hour, at lots of different angles and in lots of different directions, we can predict what's going to happen to the next baseball to leave home plate at 100 miles an hour - maybe even if we've never seen this precise combination of angle and direction before, just because we know what balls that were very similar to it did. We might end up being wrong, but no matter what it'll be another data point and that means our next prediction is going to be even better.
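
    A crude way to code up "predict from the most similar past balls" is nearest-neighbor prediction -- no neural network required. This sketch just looks up the closest past ball; every number in it is invented for illustration.

```python
import math

# Toy history of batted balls (numbers invented for illustration):
# (exit speed mph, launch angle deg, outcome)
past_balls = [
    (100, 28, "home run"), (102, 31, "home run"),
    (100, 5, "single"), (98, 3, "single"),
    (85, 45, "out"), (80, 50, "out"),
]

def predict_outcome(speed, angle):
    """Guess the outcome of a new ball from the most similar past ball."""
    def distance(ball):
        return math.hypot(ball[0] - speed, ball[1] - angle)
    return min(past_balls, key=distance)[2]
```

    Each new ball we observe just gets appended to the history, which is the "our next prediction is going to be even better" part.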

    Neural network: Imagine that you've never heard of baseball. If I gave you all of the baseball scores from last year, and I told you who won each game - something like "Braves 7, Mets 3: Braves won", you might be able to look at all of those scores and figure out that the team with the higher number always wins. You can figure that out even if nobody ever tells you the rules of baseball, just the score and the winner. This isn't super interesting of course, but it's the simplest level.
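
    You can actually watch that "figure out the higher number wins without being told the rules" step happen with the simplest possible learner, a perceptron. Everything below (the fake scores, the number of passes) is made up for illustration; the point is that the rule is never written into the code, the learner recovers it from the scores alone.

```python
import random

random.seed(0)

# Fake final scores -- no rules of baseball appear anywhere in here.
games = [(random.randint(0, 12), random.randint(0, 12)) for _ in range(300)]
labeled = [(a, b, 1 if a > b else 0) for a, b in games if a != b]

# A bare perceptron: it only ever sees (score_a, score_b, who_won).
w = [0.0, 0.0]
bias = 0.0
for _ in range(50):  # a few passes over the "season"
    for a, b, won in labeled:
        guess = 1 if w[0] * a + w[1] * b + bias > 0 else 0
        err = won - guess  # 0 when right; +/-1 when wrong
        w[0] += err * a
        w[1] += err * b
        bias += err

def predict_winner(score_a, score_b):
    return "A" if w[0] * score_a + w[1] * score_b + bias > 0 else "B"
```

    After training, the weights end up approximating "positive weight on your score, negative weight on theirs" -- i.e., the higher number wins -- even though nobody told it that.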

    Deep learning neural network: And if I gave you the result of every at-bat, in order, from all of those games, you could probably even figure out that having a baserunner reach home plate is directly related to their team's score. And if I gave you the data about how the ball moved in each at-bat, you might even figure out that the team that hits the ball in the air the most times and has the ball bounce before it's caught tends to have the most baserunners reach home plate, and therefore gets the most points, and therefore wins the game. And again, you can develop all of that knowledge without anyone ever explicitly telling you the rules of baseball.

    But what you're doing in this case is using multiple layers of a neural network - the first tells you that a ball hit 105 mph with a 30 degree launch angle and at a 20 degree angle from home plate usually leaves the field, and the second tells you that when that happens, the batter usually runs all the way around the bases and reaches home plate, and the third tells you that reaching home plate leads to a higher score, and that leads to winning.

    And from that it can figure out that Mike Trout hits the ball like that a lot more than Alcides Escobar, and therefore having Mike Trout on your team is more strongly correlated with winning than having Alcides Escobar is, and now the computer can figure out which players make winning teams without ever knowing how baseball works. It might not always be right, but it's a way to make predictions and learn things from just the raw data.

    Another example, stolen from Wikipedia, is image recognition - if a computer wants to figure out what a picture is of, all it takes as input is each dot on the screen and what color it is. But it can figure out from those where the edges of the image are, and then it can figure out combinations of edges that make shapes, and then it can figure out that a particular combination looks like a nose or an eye or a mouth, and then it can figure out that those shapes in a particular combination make a face.

    (This is what I do at my job - we have a lot of historical data that tells us that a coffee shop usually sells things for a couple dollars, and the people who buy from it usually live somewhere nearby, and they usually make sales early in the day. Not all of those things are always true, and nobody had to sit down and tell the computer how coffee shops work, but our systems can figure out that if someone who lives in Omaha buys $400 worth of coffee from a coffee shop in Moscow at midnight, this is very unusual and maybe we should prevent that charge from going through.)
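
    A stripped-down sketch of that kind of anomaly check: flag a charge whose amount, time, or distance sits far outside the shop's historical pattern. The purchase history and the three-standard-deviations threshold are invented for illustration; real fraud systems learn these patterns rather than hand-coding them.

```python
import statistics

# Hypothetical purchase history for one coffee shop:
# (amount in dollars, hour of day, buyer's distance from shop in km)
history = [
    (4.50, 8, 1.2), (3.75, 7, 0.8), (5.25, 9, 2.0),
    (6.00, 10, 3.5), (2.95, 8, 0.5), (4.10, 7, 1.0),
    (5.50, 11, 2.2), (3.25, 9, 1.7),
]

def looks_unusual(amount, hour, distance_km, threshold=3.0):
    """Flag a charge if any feature is far outside the shop's history."""
    for value, past in zip((amount, hour, distance_km), zip(*history)):
        mean = statistics.mean(past)
        stdev = statistics.pstdev(past)
        if abs(value - mean) > threshold * stdev:
            return True
    return False
```

    So the $400 midnight charge from halfway around the world trips the check, while a normal-looking $4.25 morning coffee sails through.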
    Last edited by mjl; 01-12-2019 at 01:30 AM.
    In the best of times, our days are numbered, anyway. And it would be a crime against Nature for any generation to take the world crisis so solemnly that it put off enjoying those things for which we were presumably designed in the first place, and which the gravest statesmen and the hoarsest politicians hope to make available to all men in the end: I mean the opportunity to do good work, to fall in love, to enjoy friends, to sit under trees, to read, to hit a ball and bounce the baby.

  9. #369
    Holy shit. That is really fascinating.

  10. #370
    Big Leaguer Kevin Seitzer's Avatar
    Join Date
    Jan 2011
    Location
    Houston, TX
    Posts
    3,150
    Quote Originally Posted by Bene Futuis View Post
    I have a bunch of old decades spreadsheets still in my Dropbox from way back when. [...]
    I vaguely remember that draft.
    "There was nothing for him to do under the truck, but it's tough to blame him now that he is dead." -V.Erps 3/26/2005
