Deano, thanks for the core code mod above and the robots.txt info.
Robots.txt is a handy way to reduce duplicate content on Dolphin (or any other platform) sites.
If you want to cut down on duplicated content on the site, add the offending file or directory to the robots.txt file as per Deano's pattern above. Over time the unwanted pages will drop out of the SERPs, and you should start to see an increase in rankings for the better-focused, unduplicated pages that remain.
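For anyone following along who hasn't seen Deano's post, a robots.txt entry generally looks something like the lines below. The paths here are just placeholders (not actual Dolphin paths), so swap in whatever files or directories are producing the duplicates on your own site:

User-agent: *
Disallow: /search.php
Disallow: /tags/

Bear in mind that robots.txt only stops the pages being crawled; anything already indexed will hang around until Google recrawls and drops it, which is exactly why the bulk removal tool below speeds things up.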
pmarinac, I will check out your suggestions for bulk removal of pages as the process does take quite a while without a shove from the webmaster. ;)
Thanks,
Andrew.
[EDIT] Here is the link to the Chrome extension pmarinac mentioned: http://packershack.com/blog/bulk-url-removal-google-webmaster-tools