NVIDIA CEO Jensen Huang Returns to the Birthplace of the GPU: Denny’s

Tsing

The FPS Review
Staff member
What is Jensen Huang's favorite place to eat? Denny's, possibly, as NVIDIA has shared a new article about how its CEO showed up at one of the franchise's Silicon Valley diners yesterday to revisit the location where Huang and NVIDIA's other founders came up with the revolutionary idea of the graphics processing unit (GPU). Huang, who arrived in a leather jacket, was joined by Denny's CEO Kelli Valade to unveil a plaque celebrating "the booth that launched a $1 trillion company." The pair also announced the Denny's Trillion-Dollar Incubator Contest, which offers $25,000 in seed money for the next $1 trillion idea. “We make the best pancakes here,” said Huang, who had his first hamburger and milkshake at Denny's.

See full article...
 
Uhh.

Nvidia is claiming to have invented the GPU? Lol.

There were precursors to GPUs in arcade games as far back as the 1970s.

Consoles used various forms of graphics accelerator chips throughout the '80s and '90s, long before Nvidia was founded in 1993. Commodore was big on putting dedicated graphics processors in its computers, particularly the Amiga, in the late '80s and early '90s.

Companies such as S3, Tseng Labs, Matrox, and 3dfx all had GPUs for PCs before Nvidia did.

Let's not rewrite history here. Heck, ATi (the precursor to AMD's graphics business, if any young'uns are reading) had its "Wonder" GPUs on the market in 1986 and its "Mach" GPUs in 1991.

Stop trying to rewrite history, Nvidia. You were a follower, not a leader, in the early days.
 
Yeah, that's some bullshit revisionism. And it's not the first time Nvidia has claimed that title, either.
 
I don't recall nVidia being a big player at all until they swallowed 3dfx. As Zath said, S3/Matrox/etc. were huge back in those days...
 
Aw sh1t, are we so old that now we gotta explain who ATi is?!?!?!?!?!?!

Yeah, unfortunately.

A lot of the "gamer kids" are like 12-14 when they get started. That means they were born in like 2009 to 2011. ATi was sold to AMD in 2006.

I was born in 1980. It would be like someone expecting me to be familiar with 70's mainframe component manufacturers.

Saab-Univac anyone? :p

To be fair, though, not many kids in that age group participate in web forums. They go to social media, TikTok, YouTube, Twitch, and Discord for their information. They find reading and writing to be a chore :/
 
Aw sh1t, are we so old that now we gotta explain who ATi is?!?!?!?!?!?!
Pretty soon we'll be having to explain to them what a CD is ... you know, Compact Disc, not Certificate of Deposit ... let alone cassette tapes, 8-track tapes, flip-top cans, and rotary or pay telephones.
 
I guess we need a history of computers in schools now. Seriously, computer history should be a taught subject.
 
I guess we need a history of computers in schools now. Seriously, computer history should be a taught subject.

Maybe. I think we have to try to think about it without bias, though.

We have an affinity for this old tech because we lived through it, but how much does understanding it actually benefit students?

There are certainly going to be aspects of the tech that have "foundational building block" type of qualities that help them understand how the world works, but there are going to be other aspects of it that are arcane and completely useless to them, like how we don't really need or care to know the specifics of how to operate a Telex machine, read stocks off of a physical ticker tape, or operate a telegraph or printing press.
 
Maybe. I think we have to try to think about it without bias, though.

We have an affinity for this old tech because we lived through it, but how much does understanding it actually benefit students?

There are certainly going to be aspects of the tech that have "foundational building block" type of qualities that help them understand how the world works, but there are going to be other aspects of it that are arcane and completely useless to them, like how we don't really need or care to know the specifics of how to operate a Telex machine, read stocks off of a physical ticker tape, or operate a telegraph or printing press.

So you're saying that knowing the difference between a ditto and a mimeograph is not important? :unsure:
 
I often like to tell my co-workers that IT experience of more than 10 years is more trivia than value. I’ve been working for over 20, but knowing modem AT codes, how to configure iPlanet, knowing the “Pencil trick” - it’s all just trivia. The things of value that I picked up 20 years ago are all things that can be learned with 10 years of experience.
 
I often like to tell my co-workers that IT experience of more than 10 years is more trivia than value. I’ve been working for over 20, but knowing modem AT codes, how to configure iPlanet, knowing the “Pencil trick” - it’s all just trivia. The things of value that I picked up 20 years ago are all things that can be learned with 10 years of experience.
I hate to say this... but you're wrong. I've been in IT work for just over 30 years now. The things I learned early on about DOS, comfort at the command line, and the other tricks and tips I've absorbed are still informing my capability as a top-tier troubleshooter. At my company, when people get stumped, even on programs I've never seen, they call me. And the majority of the time I'm able to lead the drive to a solution, if not solve it myself. All of my experience in the realm of IT informs my ability to troubleshoot and solve, in addition to engineer and iterate.

I'll put the caveat out there that I have a very good, nearly photographic memory when it comes to computers and nerdy stuff, so I can call on my experience to greater effect than some. That might make my experience and opinion less applicable to others.

One example... look at the tool takeown. It's an old *** command-line tool that lets you take ownership of files in Windows shares and fix access rights beyond what you can easily do at the GUI level, and you don't need to install a **** thing. I've used it with a bit of batch file goodness to repair access to millions of files in a share. Sure, it was brute force, and there may be a Sysinternals or similar GUI tool that could do it now, but it's a powerful tool for that specific use case.
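
Something like this is the basic idea - a rough Python stand-in for that kind of batch-file cleanup (the share path and group name here are made up, and it assumes you're running elevated on a Windows box):

```python
import subprocess

# Hypothetical share path and group; substitute your own environment.
SHARE_ROOT = r"\\fileserver\dept-share"
GRANTEE = r"DOMAIN\DeptUsers"

# Step 1: take ownership of everything under the share.
# /f = target, /r = recurse, /d y = default answer "yes" when a folder can't be listed.
subprocess.run(["takeown", "/f", SHARE_ROOT, "/r", "/d", "y"], check=True)

# Step 2: once ownership is reassigned, re-grant access recursively.
# icacls is the usual companion step: /t = recurse, (OI)(CI)F = full control inherited by files and subfolders.
subprocess.run(["icacls", SHARE_ROOT, "/grant", f"{GRANTEE}:(OI)(CI)F", "/t"], check=True)
```

In practice I did the same thing from a plain batch file; the Python wrapper is just to sketch the flow.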

Anyway, I'm old, but I'm still wily.
 
I do feel old. Tech advances faster than I can keep up with it, especially since I'm not in the industry directly.

I do agree with Grimlakin - things that are no longer exactly relevant still help out; you learned a skill, and there will always be some aspect of it that carries forward. But me being able to edit /etc files with vi doesn't really help much ~directly~ in today's IT world, though I'm sure something of it carries forward - the ability to pull arcane key combinations from memory, maybe, or the ability to read a man page and figure it out - something, I'm sure? But that may also just be me trying to pretend that all the hard stuff I had to do back in the day should still be relevant in some fashion, when maybe it just isn't really useful anymore - like knowing how to use a clutch or tune a carburetor.

Tech moves so fast.

I would like to see some of that history better preserved and taught, but on the flip side of that coin, there are so many things that should be taught that aren't - like financial literacy, civics, knowing how to do your laundry, etc...
 
I do agree with Grimlakin - things that are no longer exactly relevant still help out; you learned a skill, and there will always be some aspect of it that carries forward. But me being able to edit /etc files with vi doesn't really help much ~directly~ in today's IT world, though I'm sure something of it carries forward - the ability to pull arcane key combinations from memory, maybe, or the ability to read a man page and figure it out - something, I'm sure? But that may also just be me trying to pretend that all the hard stuff I had to do back in the day should still be relevant in some fashion, when maybe it just isn't really useful anymore - like knowing how to use a clutch or tune a carburetor.
You're wrong, as I pointed out in my example. I don't know if you operate in the enterprise world, but understanding the path that technology took to reach where we are now is valuable knowledge to have, whether you're problem-solving for something today or planning what's coming in the future. That wealth of experience and knowledge informs the decisions I make today, as well as how I take in new technology as it comes up and becomes applicable in my arena.

For example... without getting into advanced switches and firewalls: understanding TCP/IP, even just IPv4 today, what ports are and mean, and how to facilitate communication between systems hasn't changed much at its core, even with IPv6. When you get into invisible devices in the middle doing IDS and IPS roles (intrusion detection and intrusion prevention), then you're talking about something new.
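
To put it concretely, the core plumbing really hasn't changed: open a TCP connection to whatever port the service listens on and push bytes. A quick sketch, with a hypothetical host and port, using nothing but the Python standard library:

```python
import socket

# Hypothetical target; any host running a TCP service works the same way.
HOST, PORT = "example.com", 80

# The port number is simply how the remote OS decides which listening
# service should receive these bytes; that part is the same for IPv4 and IPv6.
with socket.create_connection((HOST, PORT), timeout=5) as sock:
    sock.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    print(sock.recv(1024).decode(errors="replace"))
```

Everything fancier, like the firewalls and IDS/IPS boxes in the middle, sits on top of that same conversation.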

Understanding how SSL works and what it is actually doing informs how you troubleshoot communications even at that level.
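
For instance, the first troubleshooting step is often just seeing what the handshake actually negotiated. A minimal sketch using Python's standard ssl module against a made-up endpoint:

```python
import socket
import ssl

# Hypothetical endpoint; 443 is the conventional HTTPS port.
HOST, PORT = "example.com", 443

context = ssl.create_default_context()  # verifies the server certificate by default

with socket.create_connection((HOST, PORT), timeout=5) as raw:
    with context.wrap_socket(raw, server_hostname=HOST) as tls:
        print("Protocol:", tls.version())   # e.g. TLSv1.3
        print("Cipher:", tls.cipher())      # (name, protocol, secret bits)
        print("Peer subject:", tls.getpeercert().get("subject"))
```

If that handshake fails, or shows an unexpected protocol or certificate, you already know which layer to dig into.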

I could go on and on with knowledge I've had for decades and how it informs my abilities today in IT.

Not EVERY role in IT needs that knowledge. I'll admit that fully. But there again is a difference between a programmer and an engineer. :)
 
The things you're pointing out - modifying /etc/resolv.conf, knowing what's in /etc/shadow, and knowing how to configure ZFS arrays from the command line - are things a Linux engineer can easily pick up in 10 years. A 10-year network engineer should know all the ins and outs of IPv4 and IPv6 and understand things like VPCs. The vast majority of things that are more than 10 years old that someone with 10 years of experience doesn't know, such as 56k modem codes, are just trivia. No employer cares that I passed the Windows 98 MCSE test module, or knew the Abit DIP switch configurations for overclocking, or know how to manage memory in DOS.
 
True. If you only want to be a singularly focused engineer, you're absolutely right.
 
The term "GPU" wasn't even known in PC circles, until at least the geforce era. Until then we called them video cards, or display adapters. I remember that some retroactively claimed that the RIVA128 as the first true GPU, because it was the first decent 3D Accelerator to be combined with a 2D display adapter on a single card, but it is completely arbitrary nonsense coined by NV fanboys.
 