SQL Server can take all the memory available on the box if the database is large enough (that is, its data size is bigger than the amount of RAM). This is normal: SQL Server tries to keep as much data in the buffer cache as it can, to avoid physical reads from disk. To prevent SQL Server from taking all the memory for its buffer cache, set the "max server memory" option for the SQL Server instance. See here for details:
http://msdn.microsoft.com/en-us/library/ms178067.aspx (there is a code sample at the bottom of the page that you can use as an example).
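For illustration, a minimal sketch along the lines of the example on that page, using sp_configure (the 4096 MB figure is just a placeholder value; pick one that leaves enough RAM for the OS and anything else running on the server):

    -- Enable advanced options so 'max server memory' can be changed via sp_configure
    EXEC sys.sp_configure 'show advanced options', 1;
    RECONFIGURE;

    -- Cap SQL Server's memory use at 4096 MB (example value, adjust for your server)
    EXEC sys.sp_configure 'max server memory (MB)', 4096;
    RECONFIGURE;

The setting takes effect immediately, without restarting the instance; SQL Server will gradually release memory above the new cap.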