We can make the AI a slave; we just need the humans to be more slave-like to do it.
Just for information: we know, from multiple studies, that working more than 40 hours a week for longer periods of time is extremely unhealthy for you. A week has 24*7 = 168 hours, and you should sleep 8 hours a night. That's 56 hours, and if you're working 60 hours, that leaves you with 52 hours, or about 7.5 hours per day, for stuff like "commuting to work", "buying groceries", "brushing your teeth", "family", "friends", "sport" or "this important appointment at the dentist".
And those 7.5 hours assume no weekend. This will kill you. You might be younger and feel strong, but this will kill you.
Not to mention that it doesn’t yield higher output. So it’s stupid on every level.
7.5 h per day is an absolute maximum for a standard workday. Crunches are sometimes fine if there's a good reason, but they probably need to be followed by extended rest.
And if you want to keep both weekend days, 60 hours in 5 days is 12 hours of work a day; minus 8 hours for sleep you get 4 hours, minus ~2 hours of commute you get 2 hours, and the rest goes to basic cooking and eating. That leaves 0 hours for anything else, including rest or any other duties. This will absolutely kill you in the long run.
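The back-of-the-envelope arithmetic in the two scenarios above can be checked with a quick sketch (the 8-hour sleep and ~2-hour commute figures are the assumptions from the comments, not data):

```python
HOURS_PER_WEEK = 24 * 7  # 168

def free_hours_per_workday(work_hours, work_days, sleep=8, commute=2):
    """Hours left in a workday after work, sleep, and commute."""
    return 24 - work_hours / work_days - sleep - commute

# 60 hours spread over all 7 days: hours/day left for everything else
# (commute, errands, family, ...) after sleep alone.
spread = (HOURS_PER_WEEK - 7 * 8 - 60) / 7
print(round(spread, 1))  # -> 7.4 (the ~7.5 h/day above)

# 60 hours packed into 5 days, with sleep and a ~2 h commute:
print(free_hours_per_workday(60, 5))  # -> 2.0
```

Which matches the thread: roughly 7.5 free hours a day with no weekend at all, or 2 hours a day (mostly eaten by cooking and eating) if you keep the weekend.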
Just a few more hours bro.
If it’s within reach of a 60 hour week then it’s within reach of a 30 hour week.
This LLM copycat bullshit is never going to be it though. It’s not thinking, it’s looking up the answers at the back of the book.
wtf? why is everyone turning techbro all of a sudden even those who are supposed to be more knowledgeable on such stuff. Oh right because there is a bubble to sustain.
“Man who works 10 hours per year tells underlings to work 60 hours per week.”
Karoshi (death from overwork) is the new cool trend. Everyone is doing it.
I highly recommend Kara Swisher’s recent book “Burn Book” for insights into the Tech lads like Brin, etc, as she’s known most of them since the 90s.
Really helps contextualize the crazy cocktail of engineering/commercial power with general naivety a lot of these guys have going.
So he's saying they've exhausted the pool of applicants so badly that they can't just replace the 60-hour weeks with normal work weeks and 150%, or maybe 200%, as many Googlers?
Power and fame break a man. Even if he wasn’t broken from the beginning.
He just wants more money and doesn’t want to pay his workers. Google has been laying off thousands of people in the last year, so there really is no shortage of applicants. They could have just kept their current workforce, maybe?
What I learned working with Googlers: they were dorks. Big-ass dorks who got used by women because, for the first time in their lives, they were attractive to these women. So many broken marriages and divorces from cheating husbands, which they joked about at the Christmas party. It was an eye-opening experience.
Thought this was an Onion article!
Hey plebs! I demand you work 50% more to develop AGI so that I can replace you with robots, fire all of you, and make myself a double-plus plutocrat! Also, I want to buy an island, small city, bunker, spaceship, and/or something.
Who gives a fuck what Sergey Brin thinks
Specifically, the women who report to him that he’s attracted to need to spend 60 hours a week dating him.
Thanks for the input, Big Head.
AGI requires a few key components that no LLM is even close to.
First, it must be able to discern truth based on evidence, rather than guessing it. Can’t just throw more data at it, especially with the garbage being pumped out these days.
Second, it must ask questions in the pursuit of knowledge, especially when truth is ambiguous. Once that knowledge is found, it needs to improve itself, pruning outdated and erroneous information.
Third, it would need free will. And that’s the one it will never get, I hope. Free will is a necessary part of intelligent consciousness. I know there are some who argue it does not exist but they’re wrong.
The human mind isn’t infinitely complex. Consciousness has to be a tractable problem imo. I watched Westworld so I’m something of an expert on the matter.
Third, it would need free will.
I strongly disagree there. I argue that not even humans have free will, yet we’re generally intelligent so I don’t see why AGI would need it either. In fact, I don’t even know what true free will would look like. There are only two reasons why anyone does anything: either you want to or you have to. There’s obviously no freedom in having to do something but you can’t choose your wants and not-wants either. You helplessly have the beliefs and preferences that you do. You didn’t choose them and you can’t choose to not have them either.
I want chocolate, I don’t eat chocolate, exercise of free will.
By your logic no alcoholic could possibly stop drinking and become sober.
In my humble opinion, free will does not mean we are free of internal and external motivators; it means that we are free to either give in to them or go against them.
Our universe is predetermined: the Block Universe (aka Eternalism).
Check mate.
It's not, according to the quantum physics we observe.
I want chocolate, I don’t eat chocolate, exercise of free will.
There’s a reason you don’t eat chocolate - likely health concerns or fear of weight gain. Your desire to stay healthy is stronger than your desire to eat chocolate. But you can’t take credit for that any more than you can blame an alcoholic for their inability to resist drinking.
I am curious to hear why you insist it’s inevitable. What intrinsic properties of the universe make you believe that we don’t have any choice and all our actions are set in stone?
What is inevitable? At no point have I claimed that our actions are set in stone. That would imply fatalism, which equally suggests that things can happen without anything causing them to happen.
For me, everything is physical, and there is always cause and effect. There is no magical non-physical consciousness. A combination of your genetics, experiences, and environment determines the "choices" you make and the actions you take. Free will is an illusion, IMO.
Your choice of words is an analytical failure: it suggests that the will is somehow sitting on top of all those processes rather than being a function of them.
I don’t think my wording implies that the will is sitting on top of those processes, but rather that it’s an emergent property of them. You’re the one who’s implying a false dichotomy - just because our choices might be influenced by prior causes doesn’t mean we don’t have agency. I’m asking what makes you think our actions are predetermined, not what makes you think we have some kind of magical free will that defies causality. Can you actually address the question I asked, rather than nitpicking my phrasing?
If your choices are a function of prior events and an emergent property of complex but deterministic processes, where does agency come in? We are a complex deterministic process that simulates its own self, both to predict a much more complex unconscious self and to write rules to influence it going forward.
We call this process being conscious, even when it's just writing just-so stories after the fact.
I guess we don’t need it then.
I don’t believe a single word of this bullshit.