Carnegie Mellon Launches Undergraduate Degree In AI (cmu.edu)
Earlier this week, Carnegie Mellon University announced plans to offer an undergrad degree in artificial intelligence. The news may be especially attractive for students given how much tech giants have been ramping up their AI efforts in recent years, and how U.S. News & World Report ranked Carnegie Mellon University as the No. 1 graduate school for AI. An anonymous reader shares the announcement with us: Carnegie Mellon University's School of Computer Science will offer a new undergraduate degree in artificial intelligence beginning this fall, providing students with in-depth knowledge of how to transform large amounts of data into actionable decisions. SCS has created the new AI degree, the first offered by a U.S. university, in response to extraordinary technical breakthroughs in AI and the growing demand by students and employers for training that prepares people for careers in AI.
The bachelor's degree program in computer science teaches students to think broadly about methods that can accomplish a wide variety of tasks across many disciplines, said Reid Simmons, research professor of robotics and computer science and director of the new AI degree program. The bachelor's degree in AI will focus more on how complex inputs -- such as vision, language and huge databases -- are used to make decisions or enhance human capabilities, he added. AI majors will receive the same solid grounding in computer science and math courses as other computer science students. They will also take additional course work in AI-related subjects such as statistics and probability, computational modeling, machine learning, and symbolic computation. Simmons said the program also would include a strong emphasis on ethics and social responsibility. This will include independent study opportunities in using AI for social good, such as improving transportation, health care or education.
The cycle begins again. (Score:1)
Re: (Score:1)
Re: (Score:2)
Look at a list of the world's biggest companies [wikipedia.org]. Seven of the top ten are tech companies, and five of those did not exist before the web. Those five have a combined value of $3 Trillion. So saying that AI is going to "fail just like the web" is a bit silly.
Re: (Score:2)
I suspect that rather than going on the 'back burner', implying it will be dead, it just won't require much investment and will be 'boring' but still there. There is a fad aspect to it resembling the .com boom (there's a commercial trying hard to show the value of a voice assistant + IoT by having someone feed their pet while driving around, which is really trying hard to solve a problem that pet owners don't have), but I don't think it's so burdensome as to warrant killing off even if the fad subsides and people stop talking about it.
Re: (Score:2)
I think those illustrate the point. Of the five companies that are entirely 'web native', only two of them (Amazon and Google) existed during the dotcom boom, and Google was a much less overwhelming business presence then. The other three web-native companies all came later.
In the fullness of time, the place of the 'web' is clear and it permeates our world, but in the dot-com boom the vast majority of the industry went bust: the technology had a lot of promise, but all these companies were blindly applying it.
Re: (Score:2)
A lot of those companies invested exclusively in a web-only presence or depended on other companies for content. They went bust when the bricks-and-mortar companies created their own websites.
There was a company called getgooey that had the idea of letting users run overlays on top of another company's website. You added their plugin to your browser and everyone could just add comments through their servers. That never took off.
Re: (Score:2)
Right. Similarly, companies that incorporate AI into their business are more likely to still be using AI 10 years from now (some of those will fail to find an application, for sure, but some of them will see success), while startups that 'just do AI' are probably going to be gone within 5 years, because it's the sort of function that will just be embedded into other businesses rather than being a business in and of itself.
Re: (Score:2)
And which of those did not exist before the web?
Alphabet? Because it is a new "financial construct"?
Re: (Score:1)
Nothing new (Score:1)
So what's new?
I studied AI as my bachelor's in 2009-2013 at Utrecht University. My programme was competing with three other AI bachelor's programmes within a 100 km radius. It's pretty rare nowadays for a university that's reasonably developed in technical fields NOT to offer a bachelor's programme in AI.
Re: (Score:1)
I studied AI as my bachelor's in 2009-2013 at Utrecht University. My programme was competing with three other AI bachelor's programmes within a 100 km radius. It's pretty rare nowadays for a university that's reasonably developed in technical fields NOT to offer a bachelor's programme in AI.
Back when I was an undergraduate, it was more common for AI to be offered as a "concentration" within the computer science degree program if it was even offered at all. In those days it was really more of an area for graduate studies because, frankly, the field just wasn't as developed as it is now and we didn't know as much. However, in the years since, we have seen the pace of research and discoveries in this area increase as the tools have finally become somewhat more equal to the tasks. It does not surprise me that AI is now being offered as a full undergraduate degree.
Re: (Score:1)
Back when I was an undergraduate, it was more common for AI to be offered as a "concentration" within the computer science degree program if it was even offered at all.
There was a Cybernetics and AI programme at the CTU FEE in the 1990s (with its own department). It seemed only logical that AI would be tied to cybernetics, probably more so than to CS.
Same. Huge demand for AI, but ho-hum to me (Score:5, Insightful)
I was thinking along similar lines before I read your post. I'm about to head back to school for my master's and put some thought into which area I wanted to study. Partly, I want to retire as early as I can, which means making good money first. There is a huge demand for AI professionals, leading to high salaries. It just doesn't interest me much, though.
In my case I think it's partially because I've been on a software quality kick the last few years. If aerospace engineering was done like software engineering, planes would crash every day. It doesn't have to be like that. We can do it right, the first time. The attitude of "it seems like it pretty much worked when I tried it, let's ship it" gets on my nerves.
While AI isn't exactly "it seems like it pretty much works", it tends to lean much more in that direction than the systems I want to create, systems about which I can say "this is known to be absolutely correct; it has been mathematically proven correct".
"If we want to be serious about quality, it is time to get tired of finding bugs and start preventing their happening in the first place." - Alan Page
Re: (Score:3)
I want to create, systems about which I can say "this is known to be absolutely correct; it has been mathematically proven correct".
At best, you can hope that you can mathematically prove that the system conforms to the specification. Whether the specification is actually what you (or the customer) wanted is still unsure.
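To put that distinction in proof-assistant terms: you can get a machine-checked theorem that the code meets the spec you wrote down, but no theorem about whether that spec is what anyone actually wanted. A minimal sketch in Lean 4 (the function and the spec are invented for illustration):

-- A toy function and the spec we happened to write down.
def double (n : Nat) : Nat := n + n

-- Machine-checked: `double` conforms to this spec for every input.
theorem double_meets_spec (n : Nat) : double n = 2 * n := by
  unfold double
  omega

-- Nothing above says whether "2 * n" is what the customer wanted;
-- if they actually needed "n + 1", the proof is still perfectly valid.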
Re: (Score:3)
"Beware of bugs in the above code; I have only proved it correct, not tried it." - Donald Knuth
The whole 'provably correct code' idea disappeared from reality as soon as I was half a step beyond academia.
I think I get his sentiment though: AI isn't programming so much as it is a data science thing. This is one of the interesting challenges of the technology: the vast majority of folks having deep engagement with it are not programmers, but currently the tools require a bit of programmer sensibility.
Times are changing. New tools provide new abilities (Score:3)
>The whole 'provably correct code' idea disappeared from reality as soon as I was half a step beyond academia.
It did at one point. Maybe around 1988 or so. In the 1970s programmers were people with degrees in math, so there was a lot more correctness. As math majors, they had done plenty of mathematical proofs, so the idea of knowing that you're getting the right answer made sense to my mom's generation.
We've had a phase of "sloppy" programming for a while now, but over that time our tools have improved immensely.
I went off topic at the end :) (Score:2)
I went off on a tangent at the end there.
Obviously code review doesn't guarantee or prove anything.
Other techniques CAN guarantee, or prove, certain things about the code, and it doesn't have to be time-consuming or expensive. Heck, just using a strongly typed language guarantees certain things that aren't guaranteed in languages without strong typing.
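For instance, here's a minimal sketch in Rust (the Payment type and functions are made up for illustration) of the kind of thing a strong static type system rules out at compile time: forgetting the "no value" case, or forgetting to handle a newly added variant.

// Minimal sketch of compile-time guarantees from a strong type system.
// The names (Payment, find_payment, describe) are invented for illustration.

enum Payment {
    Card { number: String },
    Cash,
}

// Option<T> forces every caller to handle the "no value" case;
// there is no null pointer to dereference by accident.
fn find_payment(id: u32) -> Option<Payment> {
    match id {
        1 => Some(Payment::Card { number: "4111".to_string() }),
        42 => Some(Payment::Cash),
        _ => None,
    }
}

fn describe(p: &Payment) -> String {
    // `match` must be exhaustive: add a new Payment variant later and this
    // function stops compiling until the new case is handled.
    match p {
        Payment::Card { number } => format!("card payment ({})", number),
        Payment::Cash => "cash payment".to_string(),
    }
}

fn main() {
    // The compiler will not let us use the Payment without first checking
    // whether it exists at all.
    match find_payment(42) {
        Some(p) => println!("{}", describe(&p)),
        None => println!("no such payment"),
    }
}

That doesn't prove the program does what you want, but it does guarantee a whole class of mistakes simply cannot ship.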
Re: (Score:2)
They were researching "formal verification methods" in the 1990s. Using techniques like temporal logic and automated deduction engines, they could formally verify that a CPU would be correct in all state transitions. There wouldn't be a point where an instruction could return in the wrong security ring.
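Roughly that flavor, in miniature: enumerate every reachable state of a machine model and check a property on every transition. This toy sketch (the ring/instruction model is invented, and real hardware verification uses model checkers and theorem provers on the actual design, not a hand-rolled search) checks that the only way into the kernel ring is a trap:

// Toy sketch of exhaustive state-space checking, loosely in the spirit of
// model checking. The machine model below is invented for illustration.
use std::collections::{HashSet, VecDeque};

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
enum Ring { User, Kernel }

#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum Instr { Nop, Trap, Iret }

// One transition of the toy machine: executing `i` while in ring `r`.
fn step(r: Ring, i: Instr) -> Ring {
    match (r, i) {
        (Ring::User, Instr::Trap) => Ring::Kernel, // a trap enters the kernel
        (Ring::Kernel, Instr::Iret) => Ring::User, // iret returns to user mode
        (ring, _) => ring,                         // everything else stays put
    }
}

fn main() {
    let instrs = [Instr::Nop, Instr::Trap, Instr::Iret];
    let mut seen: HashSet<Ring> = HashSet::new();
    let mut queue: VecDeque<Ring> = VecDeque::new();
    seen.insert(Ring::User);
    queue.push_back(Ring::User);

    // Breadth-first walk of every reachable state, checking the property on
    // every transition: the only User -> Kernel move is a Trap instruction.
    while let Some(r) = queue.pop_front() {
        for &i in &instrs {
            let next = step(r, i);
            if r == Ring::User && next == Ring::Kernel && i != Instr::Trap {
                println!("VIOLATION: {:?} --{:?}--> {:?}", r, i, next);
                return;
            }
            if seen.insert(next) {
                queue.push_back(next);
            }
        }
    }
    println!("property holds on all {} reachable states", seen.len());
}

Real tools explore astronomically larger state spaces symbolically, but the check-every-transition idea is the same.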
Re: (Score:2)
I have hope that better and better tools and processes will be developed, and I'd like to help develop them. So far I've started by applying practices such as code review in organizations that didn't previously do it. We've found that code review / peer review reduces bugs enough to make it worthwhile.
This stands out in this day and age. I'm glad you've successfully introduced peer review, but I haven't heard of a shop in the last decade that doesn't implement peer review. Sounds like a change at the top is needed if you're at a shop that far behind.
It was a tiny company. What else is really used? (Score:2)
There has in fact been change at the top. It was a tiny company. About a year before I joined, they had one "programmer" who wrote all the code. He wasn't trained as a programmer. A family started the business together. The brother who was "good at computers" did all the code. Since then, it's been bought by a larger company with more mature processes, but headquarters still mostly leaves us alone and lets us do things our own way.
In the last two years I've implemented code review, introduced test scripts,
Re: (Score:2)
What other practices have you seen used a lot, practical processes which really provide clear value?
Measure key metrics like cycle time, quality (bugs), code coverage, test stability, and escapes. Review weekly as a team, however small that may be. You generally don't optimize or improve what isn't measured and regularly reviewed. Release small changes often rather than big changes less frequently, and automate as much of the release process as possible.
Re: (Score:2)
The concept of bugs in 'proven correct' functions doesn't make sense. If a function is proven correct, it is correct, or else a mistake was made in the proof and it really isn't proven correct. There's a relatively small domain of functions that are both useful and provably correct, which is why proving a function correct rarely makes sense in the real world, though it can during a college curriculum.
The concept of a bug in code that you can be 100% certain only impacts a single function is also not realistic.
That too. $4,500 for the degree is quickly recouped (Score:2)
> Have a job paying 60k, save 30k, and for each day you work you earn one day of retirement.
That's true! It's something I'm working on.
> rather than spending $$$ going back to school in the hopes of a higher paying job.
After the tax credit, my master's from Georgia Tech will only cost me about $4,500. Maybe less if I can get my employer to pitch in or something. Conservatively, my master's should bump my income by *at least* $5K / year, so it'll pay for itself the first year. After that, it's an extra $5K a year.
Typo: inexpensive (Score:2)
I just noticed a typo in what I wrote. My bachelor's was INexpensive, not expensive. It should read:
--
My bachelor's was also an inexpensive online program offered by a respectable university. The degree program increased my income enough to pay for the school even BEFORE I graduated.
--
Re: (Score:2)
Most of that machine-learning stuff is really image processing. The research papers were going as far as gradient-aligned anisotropic sampling filters before they suddenly jumped into neural networks and machine learning. That stuff is/was necessary for the movie production industry because they normally hired qualified animators to spend their days airbrushing out wires and props, doing lip-sync, and fixing just about anything else. Even car driving is basically matching what the sensors detect with the co
Good curriculum (Score:2)
Just comparing it with my undergrad curriculum, which made sure that at least half of my classes were NOT related to my major, I'd say this gives a solid foundation. I would add some more stats courses beyond regression and intro to probability, though.
Re: (Score:1)
You don't understand college. Your high school, college, parents, and advisors have failed you.
I have a Master's in CS with a focus on AI; I graduated last year. There haven't been "extraordinary technical breakthroughs in AI". There have been slow, incremental algorithm improvements, as is natural, and a large increase in hardware capacity. AI isn't advancing itself. Hardware is doing all the primary advancement, and farming out tasks to the general public to generate your massive training set is the source of the rest of the apparent progress.
Forget it, just get a CS degree (Score:2)
In 4 years the bloom will probably be off the AI rose already.
Imagine if some hip college had started offering a degree in 3-D printing 5 years ago and you had invested the time and money to get one. Where would you be now? And don't forget, 3-D printing was just as big back then as AI is now; it was going to fundamentally change the world in ways no one could've even dreamed of.
Too little, too late. (Score:3)
"Undergraduate Degree In AI"
AI needs Overgraduates.
Re: (Score:2)
Curriculum (Score:3)
Standard CMU undergrad CS curriculum: https://csd.cs.cmu.edu/academic/undergraduate/bachelors-curriculum-admitted-2017 [cmu.edu]
CMU AI degree curriculum: https://www.cs.cmu.edu/bs-in-artificial-intelligence/curriculum [cmu.edu]
I dunno. IMO this could be a concentration or a graduate program. I think a classical undergrad CS program would be worth more to a student because it's more generic and thus more widely applicable.
Re: (Score:2)
Re: (Score:1)
And back in the day they refused to even offer an undergrad degree in CS. Funny how they change their minds.
Better as a required course (Score:1)
Qualifications? (Score:1)
Weird (Score:2)
Well your degree will be useless after 2045 (Score:1)