My next project involves building an enterprise data API. The data will be consumed by several applications running on different software platforms. Although my colleagues generally favor SOAP, I would like to use a RESTful architecture.
For most applications, each call only needs to return a few objects. However, some applications sometimes need to make several consecutive calls, each returning thousands of records. I'm worried about performance: serialization/deserialization overhead and network usage are where I'm afraid a bottleneck will appear. If every request carries a long delay, all of our corporate applications will feel sluggish.
Are my fears realistic? Would serialization in bulk format like XML or JSON be a problem? Are there any alternatives?
In the past, we had to perform these large data transfers using a "flatter", more compact file format such as CSV for performance reasons. How can I hope to achieve the required performance with a web service?
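To make the size concern concrete, here's a rough sketch of what I mean (the record shape and field names are just made-up examples): a self-describing format like JSON repeats every field name in every record, while CSV sends the header once and then only values.

```python
import csv
import io
import json

# Hypothetical bulk payload: 10,000 small records (fields are made up).
records = [{"id": i, "name": f"item-{i}", "price": i * 0.5} for i in range(10_000)]

# JSON: self-describing, so the field names repeat in every record.
json_bytes = json.dumps(records).encode("utf-8")

# CSV: header row once, then values only -- the "flatter" format.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name", "price"])
writer.writeheader()
writer.writerows(records)
csv_bytes = buf.getvalue().encode("utf-8")

print(f"JSON: {len(json_bytes):,} bytes")
print(f"CSV:  {len(csv_bytes):,} bytes")
```

On payloads like this the CSV output comes out several times smaller, which is why we reached for it in the past; the question is whether that gap matters once the data is going over a web service.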
While I'd prefer REST-specific answers, I'd also be interested to know how SOAP users handle this.