Depending on how they are dumping the SQL backups to disk (or any type of file, for that matter), a compression and/or encryption algorithm may have been applied first. It is important to remember that compressed data does not dedupe well, because a small change in the source can drastically alter the compressed bit stream.
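Here's a minimal sketch (Python standard library only) of that effect: two streams that differ by a single byte share nearly every raw chunk, but almost none once each stream has been through zlib. The 4 KB fixed chunking and SHA-256 fingerprints are stand-ins for a real dedupe engine, not any particular vendor's implementation.

```python
import hashlib
import zlib

CHUNK = 4096

def chunk_hashes(data: bytes) -> set:
    """Fingerprint fixed-size chunks the way a simple dedupe engine would."""
    return {hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)}

# Build ~1.8 MB of compressible, log-like data, then flip one byte near the start.
original = b"".join(b"2024-01-01 00:00:00 INFO request id=%08d\n" % i
                    for i in range(40_000))
modified = bytearray(original)
modified[100] ^= 0xFF
modified = bytes(modified)

raw_a, raw_b = chunk_hashes(original), chunk_hashes(modified)
zip_a, zip_b = chunk_hashes(zlib.compress(original)), chunk_hashes(zlib.compress(modified))

print(f"raw chunks shared:        {len(raw_a & raw_b)} of {len(raw_a)}")  # all but one
print(f"compressed chunks shared: {len(zip_a & zip_b)} of {len(zip_a)}")  # near zero
```

The one-byte flip dirties exactly one raw chunk, but it shifts the bit alignment of the deflate stream from that point on, so almost nothing in the compressed output lines up anymore.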
So always remember that the stages need to flow in this order: Deduplication -> Compression -> Encryption. Be aware that vendors fold compression into their advertised dedupe ratios, and don't take the marketing claims about dedupe ratios at face value (yes, some algorithms perform slightly better than others, but it REALLY depends on your data and how much of it is unique).
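To make the ordering concrete, here's a hedged sketch of the three stages wired together: fingerprint raw chunks first, compress only the chunks that survive dedupe, and encrypt last (encrypted bytes look random, so nothing downstream could dedupe or compress them). It assumes the third-party `cryptography` package for Fernet encryption, and the in-memory dict is a stand-in for a real chunk store.

```python
import hashlib
import zlib
from cryptography.fernet import Fernet

fernet = Fernet(Fernet.generate_key())
chunk_store: dict = {}   # fingerprint -> compressed, then encrypted, chunk

def ingest(data: bytes, chunk_size: int = 4096) -> list:
    """Store a backup stream; return its recipe (ordered chunk fingerprints)."""
    recipe = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        fp = hashlib.sha256(chunk).hexdigest()    # 1. dedupe on the RAW chunk
        if fp not in chunk_store:                 # only unseen chunks cost space
            compressed = zlib.compress(chunk)     # 2. compress what survives dedupe
            chunk_store[fp] = fernet.encrypt(compressed)  # 3. encrypt last
        recipe.append(fp)
    return recipe

def restore(recipe: list) -> bytes:
    """Reassemble a stream from its recipe, reversing the stages in order."""
    return b"".join(zlib.decompress(fernet.decrypt(chunk_store[fp]))
                    for fp in recipe)

recipe1 = ingest(b"A" * 4096 * 25)
recipe2 = ingest(b"A" * 4096 * 25)   # re-running the same backup adds nothing
print(len(chunk_store))              # 1: every 4 KB chunk was identical
```

Run the stages in any other order and step 1 sees compressed or encrypted bytes instead of raw ones, which is exactly the problem described above.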
Also, databases do not normally produce phenomenal deduplication ratios (although it's not uncommon to see between 3:1 and 6:1 depending on the data source and post-compression), because most data is uniquely structured within the database itself. It's easier to get good dedupe ratios from unstructured data such as file servers.
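For reference, a dedupe ratio is just logical bytes protected divided by unique bytes actually stored. The numbers in this sketch are invented for illustration (a hypothetical week of nightly 500 GB dumps with roughly 15% daily change), not measurements:

```python
def dedupe_ratio(logical_gb: float, stored_gb: float) -> str:
    """Express protected-vs-stored capacity as an N:1 ratio."""
    return f"{logical_gb / stored_gb:.1f}:1"

nightly_gb, days, daily_change = 500, 7, 0.15            # assumed figures
stored_gb = nightly_gb + (days - 1) * nightly_gb * daily_change  # first full + changes
print(dedupe_ratio(nightly_gb * days, stored_gb))        # 3.7:1, inside that 3:1-6:1 band
```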
If you can convince your DBAs, I always recommend using a backup agent with any deduplication solution, so that the compression/dedupe/encryption stages are handled by the backup software. It also removes one stage of I/O hit on your environment, since the dump doesn't have to be staged to disk before the actual backup runs.
Finally, be fully aware that media-type data will never deduplicate against other files, and will only deduplicate against itself if it has already been backed up before. This type of data includes pictures, audio files, video files, scanned documents, etc. You'll get better bang for your buck using specialized compression algorithms on those types of files.
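If you want to spot that kind of data programmatically, one rough approach is a Shannon-entropy probe: already-compressed formats (JPEG, MP3, MP4, etc.) sit close to 8 bits per byte, so a backup job could skip the dedupe/compress stages for them. The 7.9 threshold here is a rule of thumb I'm assuming for illustration, not any standard cutoff.

```python
import math
import os
from collections import Counter

def looks_precompressed(sample: bytes, threshold: float = 7.9) -> bool:
    """Entropy near 8 bits/byte means the bytes are close to random."""
    if not sample:
        return False
    counts = Counter(sample)
    entropy = -sum((n / len(sample)) * math.log2(n / len(sample))
                   for n in counts.values())
    return entropy >= threshold

print(looks_precompressed(b"hello world" * 1000))    # False: very redundant text
print(looks_precompressed(os.urandom(64 * 1024)))    # True: ~8 bits/byte, like media
```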
Hope that helped!