Monday, March 13, 2017

Suggestions on posting huge string to web service


Below is the structure of each record in my array:

child.Cars1 = { name: null, operation: 0, selected: false };

In the above array, the selected property represents the checked/unchecked status of a checkbox, and I post the array to a WCF web service as a string using JSON.stringify.

The array contains 2000 - 4000 records, and the user can check/uncheck any of the checkboxes.

Now consider that there are 4000 records in the array, of which 2000 are checked and 2000 are unchecked. My web service processes only the checked records; I remove the records whose selected value is false on the server.

With 4000 records the JSON string becomes huge, and because of that I get an error from the web service end:

  Error: (413) Request Entity Too Large

The reason I am not filtering out the records with selected set to false on the client is that it would create a lot of overhead in the browser and could even hang it, so right now I do the filtering on the server side.

So my question is: should I filter out the records with selected set to false on the client side and post only the 2000 checked records, or is what I am doing the right way?

My concern is that posting such a huge JSON string takes time, but filtering out the unselected records will also put a lot of load on the browser.

So I am not sure whether my current approach is right or wrong.

Can anybody please guide me on this?

8 Answers

Answer 1

It is preferable to filter data on the client side, provided you do not impact the UI thread (lock up the browser). Posting too much data can cause issues depending on what you are trying to do and the network speed of the end user. That being said, sometimes you just have to post large data. I'm not sure how large those 4000 records are, but if it's only text they can't be too large.

Since your issue is that the WCF service is responding with the 413 status, that's where you are hitting your maximum size limit. According to the WCF documentation, the maximum allowable message size that can be received by default is 65,536 bytes. Obviously, that is below what you are attempting to send.

This is what you need to update in your WCF service. Sample below is for 10MB, but increase to whatever makes sense for your data.

<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- Measured in bytes -->
      <binding maxReceivedMessageSize="10485760"> <!-- 10 MB -->
        <readerQuotas ... />
      </binding>
    </basicHttpBinding>
  </bindings>
</system.serviceModel>

If you begin getting a 404.13 HTTP status code, then you also need to update your web.config to allow longer max sizes, but set it to whatever size makes the most sense for your application.

<system.webServer>
  <!-- Add this section for file size... -->
  <security>
    <requestFiltering>
      <!-- Measured in bytes -->
      <requestLimits maxAllowedContentLength="1073741824" /> <!-- 1 GB -->
    </requestFiltering>
  </security>
</system.webServer>

<system.web>
  <!-- Measured in kilobytes -->
  <httpRuntime maxRequestLength="1048576" /> <!-- 1 GB -->
</system.web>

Answer 2

A quick fix might be increasing the server's allowed content length. This is approximately what that would look like.

<configuration>
  <system.web>
    <httpRuntime maxRequestLength="2147483647" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="2147483647" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>

Answer 3

Generally, if you start running into large request size issues like this, it is a good opportunity to look for ways to optimize rather than override. Many people have suggested ways to circumvent the problem, but fewer have offered insight into improving the design itself to be more lightweight.

Here are some possibilities:

Have you considered paging the request(s)? This would allow you to asynchronously load the data on the client as needed, thus preventing the requests from taking too long, improving the responsiveness of the website, and reducing any burdens on both the client and server memory wise. You can preemptively load the data as a user scrolls, and if need be, provide some sort of entertainment/feedback to the user if the process is taking too long, so they know there's more data being loaded.
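The paging idea above can be sketched in plain JavaScript. This is a minimal, hedged example; the page size, the `selectedCars` variable, and the `/Service.svc/Process` endpoint are all assumptions, not part of the original service:

```javascript
// Split a record array into fixed-size pages so each POST stays small.
function chunk(records, size) {
  var pages = [];
  for (var i = 0; i < records.length; i += size) {
    pages.push(records.slice(i, i + size));
  }
  return pages;
}

// Hypothetical wiring: post the pages one after another.
// `selectedCars` and the endpoint URL are assumptions for illustration.
// chunk(selectedCars, 500).reduce(function (prev, page) {
//   return prev.then(function () {
//     return fetch("/Service.svc/Process", {
//       method: "POST",
//       headers: { "Content-Type": "application/json" },
//       body: JSON.stringify(page)
//     });
//   });
// }, Promise.resolve());
```

Posting sequentially (rather than firing all requests at once) keeps memory use flat on both ends and makes progress feedback to the user straightforward.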

Have you considered changing the names of the properties to be shorter and less descriptive, reducing the footprint of the object itself? For example:

Your current model:

{ name: null, operation: 0, selected: false }

A simplified model:

{ n: null, o: 0, s: false }

An approach such as this makes the JSON itself harder to read, but JSON isn't meant solely to be read by people; it's meant to serialize data. The loss of readability can be overcome by documenting your model. Doing it this way may reduce the size of the data being sent by up to 30%.
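As a rough sketch of the renaming idea, a small mapping function can translate the descriptive model into a compact wire model just before stringifying (the short key names n/o/s are illustrative, matching the simplified model above):

```javascript
// Map each record to a compact wire model with single-letter keys.
function toWireModel(cars) {
  return cars.map(function (car) {
    return { n: car.name, o: car.operation, s: car.selected };
  });
}

var cars = [{ name: null, operation: 0, selected: false }];
var before = JSON.stringify(cars).length;           // descriptive keys
var after = JSON.stringify(toWireModel(cars)).length; // short keys

console.log(after < before); // true
```

The server then needs to translate the short keys back, so document the mapping in one place shared by both sides.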

I cannot provide a complete solution, because you will have to ask yourself a lot of hard questions about what you're trying to achieve, who will be consuming the data, and what the best way to get there is.

In addition, I would strongly consider questioning why a process would require a user to interact with 2000+ records at once. I am not trying to criticize, but you should take a critical look at the business process behind what you're trying to achieve; there may be serious issues with repetitiveness, stress on the user, and more, which would greatly affect how effective and useful your application is to the end user. For example, are there ways to break the task into smaller, less tedious blocks so that the end user doesn't stare at 4000 checkboxes for 2 hours?

This may not be the answer you're looking for as it opens up a huge number of additional questions, but hopefully it will help you start to formulate questions which help shape the final answer.

Answer 4

  1. Save the JSON value as a file
  2. Upload the JSON file (getting back a unique file name)
  3. Call your WCF method with the file name passed instead of the JSON value
  4. Read the data from that file inside the method
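A minimal client-side sketch of the first steps, assuming a hypothetical `/upload` endpoint that returns the unique file name (the endpoint, field name, and sample data are all assumptions):

```javascript
// Sample data standing in for the real array.
var cars = [{ name: "A", operation: 1, selected: true }];

// Step 1: wrap the JSON string in a file-like Blob.
var json = JSON.stringify(cars);
var blob = new Blob([json], { type: "application/json" });

// Step 2: upload it as multipart form data instead of a raw JSON body.
var form = new FormData();
form.append("file", blob, "cars.json");

// Hypothetical upload call; the URL and response shape are assumptions.
// fetch("/upload", { method: "POST", body: form })
//   .then(function (res) { return res.text(); }) // server returns a unique file name
//   .then(function (fileName) {
//     // Step 3: call the WCF method with fileName instead of the JSON itself.
//   });
```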

Answer 5

You should filter on the client, as you don't know how slow the connection to the server might be. Sending unneeded data is likely to be slower than filtering it first.
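The client-side filter needs no library at all; a minimal sketch in plain JavaScript (the `cars` array is sample data standing in for the real one):

```javascript
var cars = [
  { name: "A", operation: 1, selected: true },
  { name: "B", operation: 2, selected: false },
  { name: "C", operation: 3, selected: true }
];

// Keep only the checked records before serializing.
var checked = cars.filter(function (car) { return car.selected; });
var payload = JSON.stringify(checked); // post this instead of the full array

console.log(checked.length); // 2
```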

If filtering the data is hanging the browser, then you can get around this problem by using a WebWorker.
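A rough sketch of the Web Worker approach; the Worker API itself is standard, but the file name and message shape here are assumptions:

```javascript
// filter-worker.js (sketch): a pure filtering function plus the worker wiring.
function selectChecked(records) {
  return records.filter(function (r) { return r.selected; });
}
// self.onmessage = function (e) { self.postMessage(selectChecked(e.data)); };

// main.js (sketch): hand the array to the worker, stringify only what comes back.
// var worker = new Worker("filter-worker.js");
// worker.onmessage = function (e) {
//   var payload = JSON.stringify(e.data); // only the selected records
//   // POST payload to the WCF service here
// };
// worker.postMessage(cars);
```

Because the worker runs on its own thread, the filter over thousands of records never blocks the UI, at the cost of one structured-clone copy of the array each way.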

Answer 6

If you are using the GET method, this will not work, because a GET URL has a maximum length of around 2,083 characters (in Internet Explorer). Please use the POST method and send the JSON object in the request body.

Answer 7

You can set maxRequestLength in the config. These settings worked for me to upload 750 MB.

<system.web>
  <httpRuntime maxRequestLength="2097151" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>

The maximum value for maxRequestLength is 2097151; if you try to set it higher, an error occurs. And MSDN says:

The default size is 4096 KB (4 MB).

Answer 8

Filtering on the client side should not create a huge overhead, and it should keep the resulting JSON string within the request size limits.

Using underscore.js, it is straightforward to filter down to only the selected/checked elements and map to only the required operations:

var filteredCars = _.where(cars, {selected: true});

// An assumption that operation is a unique car id
var operations = _.map(filteredCars, function (car) {
  return {id: car.operation};
});

Please see this extended JSFiddle for filtering with a similar number of elements.
