
Mr. Altman’s departure follows a deliberative review process by the board, which concluded that he was not consistently candid in his communications with the board, hindering its ability to exercise its responsibilities. The board no longer has confidence in his ability to continue leading OpenAI.

Too professionally vague to be interesting.

If you want to read a ton of discussion on this, the HN thread is pretty great and is burning up.

reply

deleted by author

reply

Close! It is a site from 2004 iirc.

reply

I really like it. It's extraordinarily usable and fast, zero noise. Wish they supported markdown in comments, though, and not just images.

reply

Also, unlike stacker.news, HN has an easy and obvious way to collapse threads. We should have that here too!

reply

The eye button at the right of every reply not easy enough?

reply

You can collapse them here in the upper right corner of the comment.

Is it not easy or not obvious?

reply

Like the eyeball on each comment?

reply

lol, did we all respond at the same time?

reply

Yes, yes we did lol

reply

deleted by author

reply

I'm with you there -- I used to have an app that did dark mode. There are alternative front-ends to it that I periodically investigate. I just looked through a few and they don't have dark mode, but I'm sure one does someplace.

EDIT: here's one that looks nice.

reply

This is the one I used to use -- wonder why I stopped. Better interface than the above, and sweet sweet dark mode.

reply

deleted by author

reply

I'm with you there, too :)

reply

can’t recall any other examples of a ceo being removed at the pinnacle of perceived excellence… i wonder when the full story will come out.

reply

If he actually did something bad it's hard to imagine it won't be leaked.

reply

deleted by author

reply

my thoughts went straight to this as well. either this story or something of a similar nature but more recent and at the company.

very unusual to fire so suddenly.

time will tell.

reply

I thought this was a Bitcoin Bugle post at first. Got me again!

reply

deleted by author

https://m.stacker.news/5163

Relevant, idk who Jimmy is but seems like he called it

reply

People say a lot of things about @sama, but I expect innovation at OpenAI to slow down and diffuse, if not halt.

reply

Yup, This! The innovators are always a bit out there!

reply

Looks like Sam just got a reminder of what ‘open’ really means.

He won’t make that mistake again.

reply

Tiny violin

reply

deleted by author

reply

Keep looking for the secret to life extension which they’ll never find lol

reply

I've been critical of OpenAI since ChatGPT was released, but nobody listened to my arguments for why ML and LLMs are not a path to AI. Now that OpenAI is having problems, a bunch of people are suddenly critical of LLMs and ML.

Most people are behind the curve.

reply

I assume you mean AGI? I don’t think so either. If we use humans as an example, we aren’t taught everything we know. We’re born with instincts that took millions of years for genetics to learn. AGI might require instincts in addition to this kind of probabilistic extrapolation.

I actually don’t know but that’s the most bearish case I can make for current AI techniques.

This isn’t my idea either. It’s something I heard from Chomsky, who critiques current AI by deriding it as empiricism.

reply

I assume you mean AGI?

Yes, but I should have been more clear. My critique of OpenAI is that it is a scam organization, because ChatGPT was always nothing more than a glorified toy, and yet OpenAI promised that it was a path to AGI. If OpenAI hadn't accepted hundreds of millions of dollars in donations, but maybe only a few million, I wouldn't be calling them scammers.

I suspect, but can't prove, that most of the hype around OpenAI was orchestrated by OpenAI's marketing department. If true, then OpenAI operated a lot like shitcoiners, using the revenue from their scam to pay unscrupulous people/websites to promote their scam, thus getting them more donations. An audit of OpenAI's financial transactions would prove or disprove my suspicion.

If we use humans as an example, we aren’t taught everything we know. We’re born with instincts that took millions of years for genetics to learn. AGI might require instincts in addition to this kind of probabilistic extrapolation.

Off the top of my head, the AI project closest to instinctual knowledge I can think of is Cyc, a hand-curated collection of common-sense knowledge. Allegedly it has proven useful, but so far not for making an AGI.

AI researchers have investigated almost every conceivable avenue for creating AGI. ML was tried decades ago, but only recently did we get machines fast enough and drives big enough to process and store the required data.

...that’s the most bearish case I can make for current AI techniques. [emphasis added]

I'm glad you put it that way. The current ubiquitous focus on Machine Learning is hampering the research and application of AI. Older techniques, such as expert systems, are ignored because they can't be scaled through just optimization and better hardware. And yet expert systems are useful right now (e.g. for helping doctors diagnose diseases), but they require people to put in mental effort to make them useful -- just like any other programming project.
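
To make that concrete, here's a toy sketch of what a hand-written rule base looks like (the findings, conclusions, and rules are invented purely for illustration, not taken from any real system):

```python
# Toy forward-chaining "expert system": every rule below was written by a person.
# The point is that the knowledge lives in hand-authored rules, not in training data.

RULES = [
    # (required findings, conclusion)
    ({"fever", "cough", "shortness_of_breath"}, "consider pneumonia"),
    ({"fever", "stiff_neck", "headache"}, "consider meningitis"),
    ({"fatigue", "increased_thirst", "frequent_urination"}, "consider diabetes"),
]

def diagnose(findings: set) -> list:
    """Return every conclusion whose required findings are all present."""
    return [conclusion for required, conclusion in RULES if required <= findings]

print(diagnose({"fever", "cough", "shortness_of_breath"}))
# ['consider pneumonia']
```

Every rule there had to be typed in by someone who understood the domain -- no amount of GPUs scales that part for you, which is both the weakness and the reason the result stays auditable.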

It’s something I heard Chomsky critique current ai with deriding it as empiricism.

I haven't read his critiques, but I suspect I probably agree with most of what he has to say, since I also have a dim view of empiricism.


On a broader note, a lot of AI researchers are about to lose their jobs, and most AI job postings will soon vanish, not to be seen again for a decade. So begins the next in a long line of AI Winters.

reply

I suspect we’ll get AGI by combining older techniques like expert systems (to provide the instincts) with ML and whatever breakthroughs come after.

Have you been underwhelmed using ChatGPT? There are times when it really surprises me and saves me a lot of time (e.g. a well-scoped programming task) and others where it sucks. The times when it surprises me are pretty magical, I have to admit.

reply

First female CEO of an AI company. Didn't last long...

reply

deleted by author

reply

i think they meant that the female CTO is interim CEO now => first female CEO of an AI company

but not sure how to interpret the "didn't last long...". Did Sam Altman not last long?

reply

I thought this was from the bugle at first

reply

Oh, already several posts about this. Guess megathreads to bundle them would be cool

reply

Maybe this news means that Microsoft will buy OpenAi pre-market on Sunday night

reply

Could be.

It’s that, or a massive data breach. Aligns with the “building too fast” rhetoric.

Rumours are circulating that Microsoft, their biggest investor, told staff to stop using it earlier this month because of security concerns. Probably less likely tho.

reply

"Just over a month ago, his sister came forward with allegations of him sexually abusing her when she was as young as 4 years old"

👀

reply

The new CEO, Mira Murati, is a statist. Yuck!

She is an advocate for the regulation of AI, arguing that governments should play a greater role therein.

reply

Shocker.

reply

hopefully this puts an end to the orb

reply

two separate companies. unfortunately I highly doubt this will impact his shitcoin at all. unless there's something more sinister beneath the surface than 'communication issues'

reply

The real ‘open’ AI company may well be born from this. Truly open source, built properly. Modular with easier training tools.

Sam and his future team have learnt a huge lesson the hard way.

Open wins. Not just by name.

reply

we dance on graves of shitcoiners https://m.primal.net/HQao.png

reply

Is there something deeper going on?

"OpenAI's President & Co-Founder Greg Brockman quits hours after Altman's firing" --> #320202

reply

the same fate awaits the shitcoin Worldcoin...

reply

deleted by author

reply

Or as we say 'round these parts, "i haz fast draw"

reply

deleted by author

deleted by author