Algorithm? What's that then?
Chambers 20th Century Dictionary defines an algorithm as "a rule for solving a mathematical problem in a finite number of steps". And the Computing Dictionary says:
"Technically an algorithm must reach a result after a finite number of steps, thus ruling out brute force search methods for certain problems, though some might claim that brute force search was also a valid (generic) algorithm. The term is also used loosely for any sequence of actions (which may or may not terminate)."
My view? Well, I think of an algorithm as a machine, comparable to an engine, or some other device, for solving a problem. An example: the problem might be powering a motor car. There are a host of different engine types: diesel, petrol, gas turbine, steam, electric. To me, they look like algorithms for the problem of propelling a car. I sometimes think of a bridge as an algorithm for crossing a gulf or river. There are many different kinds of bridges: suspension, girder, etc., and each has its own role and cost. Same with the car engines: different roles and costs.

In some sense I visualise algorithms in terms of cogs, wheels and cranks: a mechanical thing.

You have been using algorithms all the time, and probably weren't aware of it. You were given algorithms for long division, multiplication, addition and subtraction in primary school. You were the computer, and you executed those algorithms. You benefit from algorithms every time you play an arcade game with high-performance graphics: there will be a host of algorithms for working out when two objects collide (in the shoot-'em-up games, or car chases), when you are past a given point, and so on. Calculators use algorithms to find logarithms, square roots, factorials, etc. There is even an algorithm in my video camera to reduce camera shake.

Algorithms ain't programs, just as a recipe for steamed fish ain't a meal (it just tells you how to steam the fish). I could describe an algorithm in English or in some "pseudo code" (something that looks like a very high-level language, but for which we have no compiler). So an algorithm ain't necessarily a piece of code. An algorithm might be "on the shelf", in a book, and when you have a specific computational problem, you might be able to identify that problem as an instance of (let's say) X, and we know that there is an algorithm for X. So we go and find it, and code it up.

And there are good and not so good algorithms. If you are implementing a solution to a given problem you might have a choice of algorithms. If you choose the wrong one you might get bad performance and the system will be unusable. You could get it right, and end up with a market leader!

There are loads of algorithms. There are algorithms for putting things in place, for locating something, for finding some configuration that is optimal (in some sense). There are conferences and journals devoted to algorithms, books, and web sites. There is a huge body of research in algorithms: trying to find better algorithms, and trying to better understand the behaviour of the algorithms that we already have. And I must confess, I do a bit myself.

Algorithms are ancient
The gentleman on the stamp is Abu Jafar Mohammed ibn Musa al-Khwarizmi, who first suggested a mechanical method for adding two numbers represented in the Hindu numeral system (circa A.D. 800). And that is how we get the word algorithm (a mispronunciation of al-Khwarizmi?). I stole this picture from Donald Knuth's home page.

But there are algorithms much older than Abu Jafar Mohammed ibn Musa al-Khwarizmi. Euclid (a Greek mathematician of 3rd-century BC Alexandria) is credited with, amongst other things (e.g. The Elements), the description of a technique for finding the highest common factor (HCF) of two integers by:

dividing the larger by the smaller, then the smaller by the remainder of that first division, then the remainder of the first division by the remainder of the second, and so on until the process terminates with a remainder of zero.
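Euclid's technique can be sketched in a few lines (Python is my choice here, not the author's):

```python
def hcf(a, b):
    """Highest common factor of two positive integers, by Euclid's method."""
    while b != 0:
        a, b = b, a % b  # divide, then carry on with the remainder
    return a

# The divisions go: 1071 = 2*462 + 147, 462 = 3*147 + 21, 147 = 7*21 + 0.
print(hcf(1071, 462))  # → 21
```

When the remainder reaches zero, the last non-zero remainder is the HCF.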

Algorithms are important
It's important that you know about them, and know that algorithms exist for certain problems, so that you don't have to reinvent the wheel. It's important that you know something about algorithms, such as the cost of using them, so that you can make sound engineering decisions when you solve a problem. Even though our systems get faster as hardware improves, those gains are frequently insignificant compared to the improvements a better algorithm can bring. It's not unusual to see a problem solved millions of times quicker by one algorithm than by another.
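A toy illustration of how much the choice of algorithm matters (my own example, not from the text): two ways of computing Fibonacci numbers. The naive recursion takes exponentially many steps, while the iterative version takes linearly many, so even for modest inputs the gap is enormous.

```python
def fib_naive(n):
    # recomputes the same subproblems over and over: exponential time
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

def fib_iter(n):
    # a single pass: linear time
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Same answer, wildly different cost: fib_naive(30) makes over a
# million recursive calls; fib_iter(30) does thirty additions.
print(fib_naive(20), fib_iter(20))  # → 6765 6765
```

Same problem, same answer; one algorithm is usable and the other quickly isn't.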
So, what are abstract data types and data structures all about then?
Well, it's about trees, and stacks, and lists, queues, deques, doubly linked lists, and heaps, graphs, digraphs, arrays, vectors; all those things. But those are just words (heaps? deque?), aren't they? Here's what the Computing Dictionary has to say about abstract data types:

abstract data type
(ADT) A type whose internal form is hidden behind a set of access functions. Objects of the type are created and inspected only by calls to the access functions. This allows the implementation of the type to be changed without requiring any changes outside the module in which it is defined. Abstract data types are central to object-oriented programming where every class is an ADT. A classic example of an ADT is a stack data type for which functions might be provided to create an empty stack, to push values onto a stack and to pop values from a stack. (22 Feb 1995)

So, an abstract data type (ADT) implements some property, but hides from the user (the programmer) just how it is implemented. As an example, we might have an ADT for a set. We would then provide methods to create a set, test for membership, form unions and intersections, take set differences, and add and delete elements. We give those to the user, and we implement them. We can implement the set ADT any way we choose, so long as we provide the required functionality.
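As a sketch of what that might look like (the class and method names here are mine, purely for illustration), here is a set ADT whose users see only the access methods, never the list hidden underneath:

```python
class Set:
    """A set ADT: the representation is hidden behind access methods."""

    def __init__(self):
        self._items = []  # hidden representation; could change without users noticing

    def add(self, x):
        if not self.member(x):
            self._items.append(x)

    def delete(self, x):
        if self.member(x):
            self._items.remove(x)

    def member(self, x):
        return x in self._items

    def union(self, other):
        result = Set()
        for x in self._items + other._items:
            result.add(x)
        return result

    def intersection(self, other):
        result = Set()
        for x in self._items:
            if other.member(x):
                result.add(x)
        return result
```

Because callers only ever go through `add`, `delete`, `member` and friends, we could later swap the list for something else entirely and nothing outside the class would need to change.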

How we implement that functionality brings us to data structures. For the set example above we might use an array or a vector. This will have advantages and disadvantages. We might instead use a linked list, or a hash table. Each of these representations (ways of structuring the data) has advantages and disadvantages, such as speed and space consumption. Again, if we make a bad choice of data structure we might deliver an unusable system, one that is too slow or takes up so much space that we can only work on small data sets. Get it right, and again we could have a market leader!
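To make that trade-off concrete (a toy sketch, with names of my own): the same membership test costs very different amounts under two representations. Scanning a list takes time proportional to the set's size; a hash table answers in roughly constant expected time, at the price of extra space.

```python
class ListSet:
    """Membership by scanning a list: cheap on space, O(n) per lookup."""
    def __init__(self):
        self._items = []
    def add(self, x):
        if x not in self._items:
            self._items.append(x)
    def member(self, x):
        return x in self._items

class HashSet:
    """Membership via a hash table: O(1) expected per lookup, more space."""
    def __init__(self):
        self._table = {}
    def add(self, x):
        self._table[x] = True
    def member(self, x):
        return x in self._table
```

Both present the same interface; only the cost profile differs, and that difference is exactly the engineering decision being described.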

Why are data structures important?
For pretty much the same reason that algorithms are important. Data structures have costs and benefits. You want to know about these, so that when you make a design decision, it is an informed one, not just a guess or a hunch. Furthermore, when you make your choice you have an understanding of the consequences of that choice.
In Summary?
A computer scientist needs to know about, and understand, algorithms and data structures so that they can make sound engineering decisions, and so that they are more than just a hacker producing systems with unpredictable behaviour.
Oh, and I nearly forgot. They are also fun, and sometimes beautiful.