It's a very sticky situation. From what I understand, even if you block the crawlers with robots.txt, a page's URL can still end up in the index if anyone ever links to it from another site. The only surefire way to block indexing (even when someone links to the page) is the noindex meta tag, and for it to work the page has to stay crawlable, so don't block it in robots.txt at the same time. I think if you put the tag in the Dolphin header template it would block EVERYTHING on the site from being indexed.
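The tag itself is just one line in the page's head section (where exactly you'd put it in the Dolphin header template depends on your version, so treat the placement as an assumption):

```html
<!-- Goes inside <head>; tells rule-following crawlers not to index this page -->
<meta name="robots" content="noindex">
```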
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
Even with that, your content might still get scraped by bots that don't follow the rules. If that's a concern, I think you'd have to set it up so the content is only displayed when the user is logged in.
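A login gate would look something like this in plain PHP. This is just a sketch using a generic session check and a made-up `member_id` session key and `/join.php` redirect target, NOT Dolphin's actual member API (Dolphin has its own functions for checking membership):

```php
<?php
// Minimal sketch, not Dolphin's real API: only render the page for logged-in members.
session_start();
if (empty($_SESSION['member_id'])) {
    // Guests (and scraper bots) get redirected before any content is sent.
    header('Location: /join.php', true, 302);
    exit;
}
// ...protected content is only output past this point...
```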
As for pages that are already indexed, if I'm not mistaken you can sign up for Google Webmaster Tools and request that they be removed from the index. I've never tried it, so I'm not sure how it works or how effective it is.
BoonEx Certified Host: Zarconia.net - Fully Supported Shared and Dedicated for Dolphin