my experience is that when you use pretty URLs or the standard URLs consistently throughout your site, you shouldn't have a problem. E.g. if you use pretty URLs, make sure you don't have any "unpretty" URLs hiding behind a link somewhere. If you ensure consistency in your template design (also check module, RSS, or whatever other templates you use), everything should be fine.
OK, if people link to you with the wrong URL scheme, you still end up with wrong links. In that case, 301 redirect the URL, contact the other webmaster, and you should be fine. Then again, I just copy and paste URLs... I don't think anyone would purposely alter the scheme to dilute your results (a mean black-hat method, now that I think of it).
If you already face the duplicate content problem, such as two identical pages ranking for the same keyword without Google identifying them as duplicate content, correct the corresponding links and 301 redirect the wrong URL to the correct one. With time your results should become clean, and you get to keep the potential link juice. I admit, with 1000 pages some automation would be nice; it can probably be solved with a generic rule redirecting one pattern to the other. You could also use robots.txt to disallow the wrong URLs...
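Such a generic redirect rule could look like the following sketch. It assumes Apache with mod_rewrite enabled, and that the "wrong" URLs follow a hypothetical `index.php?page=foo` scheme while the canonical pretty URL is `/foo` - adjust the pattern to whatever your actual scheme is:

```apache
# Sketch: 301 redirect index.php?page=foo to the pretty URL /foo
# (assumes Apache mod_rewrite; the URL pattern is a made-up example)
RewriteEngine On
RewriteCond %{QUERY_STRING} ^page=([a-z0-9-]+)$ [NC]
RewriteRule ^index\.php$ /%1? [R=301,L]
```

The trailing `?` in the target strips the old query string so the redirect doesn't carry `page=foo` along. For the robots.txt route, a simple `Disallow: /index.php` line under `User-agent: *` would keep crawlers off the unpretty variants instead.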
If the content is actually different but flagged as duplicate content, try to use unique titles. If you use keywords and description meta data, make sure they are unique too. Also ensure that the automatically generated parts like menus, teasers, etc. are not overwhelmingly more text than the actual content.
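For clarity, "unique titles and meta data" just means each page gets its own values in the head - something like this sketch (all values invented for illustration):

```html
<!-- Sketch: per-page title and meta tags; the values are hypothetical -->
<head>
  <title>Blue Widgets - Example Shop</title>
  <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
  <meta name="keywords" content="blue widgets, handmade">
</head>
```

If your CMS fills these from a template, check that the template pulls per-page values rather than one site-wide default.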
Concerning the sitemap: it is a valuable tool for SEO activities... however, as suggested, it complements the regular crawling and can help you make sure that your important pages get crawled. Make sure that the sitemap and the content pages show the same link structure.
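With 1000 pages you'd generate the sitemap rather than write it by hand. A minimal sketch, assuming you can list your canonical (pretty) URLs somehow - the example pages here are hypothetical:

```python
# Minimal sitemap.xml generator (sketch) - feed it the canonical URLs only,
# so the sitemap and the on-page links show the same structure.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical page list - in practice, pull this from your CMS or database.
pages = ["http://example.com/", "http://example.com/about"]
print(build_sitemap(pages))
```

The important part is the source of the list: if it comes from the same place your templates build their links from, sitemap and site can't drift apart.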
Concerning the whole pretty-URLs discussion, I don't think it is such a huge issue. First of all, search engines mostly understand http://example.com/index.php?page=home perfectly well. Yes, it's not pretty, and URLs are a ranking factor, but your keyword is there. If you want the last tiny bit, you have to create sexy rewrite rules. That's something the admin should be able to handle for you (or check the dozens of tutorials out there on the web). Regular expressions are hard to learn (I think) but definitely worth it.
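For the record, such a rewrite rule doesn't have to be scary. A sketch, again assuming Apache mod_rewrite and the hypothetical `index.php?page=home` scheme from the example:

```apache
# Sketch: serve the pretty URL /home internally from index.php?page=home
# (assumes Apache mod_rewrite; pattern is illustrative only)
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^([a-z0-9-]+)/?$ index.php?page=$1 [L,QSA]
```

The `!-f` condition keeps real files (images, CSS) from being swallowed by the rule, and `QSA` preserves any extra query parameters.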
Additionally, I think that well-written and well marked-up content (e.g. using microformats, headlines, paragraph tags, etc.) is much more important - if you have to decide where to invest your energy.

http://www.seomoz.org/article/search-ranking-factors
A great tool for checking the link structure, already in your sandbox, is Xenu's Link Sleuth (http://home.snafu.de/tilman/xenulink.html). It will crawl the links in your site, and the results will quickly tell you if there's a problem.
Just some thoughts
Best
Nils
-------
edit: @jeremy... hehe, you were just posting while I was still writing
