-Slicing and Dicing of data
-Capabilities to cater to ad-hoc queries
Depends on what you mean by ad-hoc queries. Reporting is very dynamic, but users do NOT run queries against your database. They instead filter data that you've already loaded into memory.
-Implementation of cubes
QlikView does NOT use (or need) data cubes internally. It works in a different way that I personally think is superior.
-Capabilities to develop reusable data elements (e.g. webparts)
Depends on what you mean by reusable elements. There are a lot of reusable elements, but I'm not familiar with webparts.
-Export functionality (PDF)
Yes, with QlikView Publisher.
Better than what? I like the UI just fine.
-Capabilities for mobile, i-pad reporting
Yes, but I think there's room for improvement here. I don't think the native iPad client takes good advantage of the larger screen. You can publish in AJAX and pull it up in Safari on the iPad, but it doesn't work particularly well with the touch screen interface. They ARE actively working on it, though.
-Integration with .Net, MS Office, MS SharePoint
Mmmm, not sure. You can export to and import from Excel. I know I've seen "SharePoint" mentioned many times on the forum. Not sure about .Net.
-Ease of use
For users, it's as easy as you make it for them as a developer. Generally pretty easy. For developers, I consider it fairly easy, though I think there are some definite areas for improvement.
-Response time, Memory consumption
Depends on how much data you have and how complicated you make your charts. Generally speaking, most of my real applications respond near-instantaneously or in under a second. Some of the more complicated ones take longer, with my absolute worst charts taking many seconds to finish calculating. For memory consumption, figure on about 10x compression of your raw data, but then add back in another 5-10% for every concurrent user of the data. Since analysis is done entirely in memory, you need that much RAM. You can't afford to swap to disk.
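As a back-of-the-envelope illustration of that sizing rule (the 10x compression and the 5-10% per-user overhead are the rough estimates from this answer, not official QlikView figures), here's a quick sketch:

```python
def estimate_ram_gb(raw_data_gb, concurrent_users,
                    compression=10, per_user_overhead=0.075):
    """Rough in-memory footprint estimate (all figures are assumptions):
    raw data compressed ~10x, plus ~7.5% of the base per concurrent user."""
    base = raw_data_gb / compression
    return base * (1 + per_user_overhead * concurrent_users)

# 100 GB of raw data with 20 concurrent users:
# base = 10 GB, overhead = 10 * 0.075 * 20 = 15 GB, total = 25 GB
print(estimate_ram_gb(100, 20))
```

So even a fairly large raw dataset can fit comfortably in RAM on a single server, as long as you budget for the per-user overhead up front.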
-Short build and deploy time
-Max. of Data Handling(2 TB or more)
I believe it was designed for gigabyte-scale databases. However, I am not aware of any technical limitations (other than available hardware) that would prevent you from using it on terabyte-scale databases. I think Windows Server maxes out at 2 TB of RAM, but in practice, you can run a lot of different applications on a lot of different servers to get around even that limitation. Should be doable, but I'd want to hear from people actually running that much data before making a commitment. Our shop probably has only 100 GB or less, so we don't count.
-Tech support(24 X 7)
I don't know tech support's hours. I've never called them. I email. But for most issues, I recommend the forum. It's very active and helpful.
-64 Bit OS support