Manali Kapoor
Feb 27, 2020 | Last updated: Jan 11, 2023
Expert articles | 5 min read

QueryExpression (or FetchXML) is one of the most common ways to retrieve data from Dynamics 365 in C#, and it’s so convenient, isn’t it?

But if there are more than 5,000 records, which is the case in quite a few organizations, we are bound to use paging on top of a simple QueryExpression or FetchXML.

By default, the organization service returns only the first 5,000 records, along with page details for the next set of records, if any exist.
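Page details are requested through the PagingInfo object on the query. A minimal sketch of an explicit first-page request (assuming the Microsoft.Xrm.Sdk assemblies and an IOrganizationService named service):

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Ask explicitly for the first page of up to 5,000 accounts.
QueryExpression query = new QueryExpression("account")
{
    ColumnSet = new ColumnSet("name"),
    PageInfo = new PagingInfo
    {
        PageNumber = 1,      // pages are 1-based
        Count = 5000,        // records per page; the server caps this at 5,000
        PagingCookie = null  // null for the first page; taken from the response afterwards
    }
};

EntityCollection firstPage = service.RetrieveMultiple(query);
// firstPage.MoreRecords and firstPage.PagingCookie drive the request for the next page.
```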

While I was trying to find the correct way to retrieve all the records by paging, most of the examples that I found were for FetchXML. So, in this blog post I will be using QueryExpression not only to extract all the records but also to work with them.

First, let’s create a simple QueryExpression for extracting all records in Dynamics 365:

QueryExpression query = new QueryExpression("account");
query.ColumnSet = new ColumnSet("name");

For fewer than 5,000 records you would use:

  var accountCollection = service.RetrieveMultiple(query);

How do we extract more than 5,000 records?

Paging for the win! Where do we find this paging? The answer is in the EntityCollection response that RetrieveMultiple returns. We extract the PagingCookie from the response and send it with our next page request via the query’s PageInfo. In addition to the paging cookie, the response also contains MoreRecords, a Boolean that tells us whether there are any more records to fetch.

It returns “true” or “false”, which helps us determine whether we have reached the last page or there are still records to fetch.

QueryExpression query = new QueryExpression("account");
query.ColumnSet = new ColumnSet("name");
var completeAccountCollection = RetrieveAllRecords(service, query);

public static List<Entity> RetrieveAllRecords(IOrganizationService service, QueryExpression query)
{
    var pageNumber = 1;
    var pagingCookie = string.Empty;
    var result = new List<Entity>();
    EntityCollection resp;
    do
    {
        if (pageNumber != 1)
        {
            query.PageInfo.PageNumber = pageNumber;
            query.PageInfo.PagingCookie = pagingCookie;
        }
        resp = service.RetrieveMultiple(query);
        if (resp.MoreRecords)
        {
            pageNumber++;
            pagingCookie = resp.PagingCookie;
        }
        // Add the results of this page to the list to be returned.
        result.AddRange(resp.Entities);
    }
    while (resp.MoreRecords);

    return result;
}

Here I am using a do-while loop to keep retrieving records as long as the response says MoreRecords is true, i.e. more records still exist on the next page.

The code above will return us all records in the Dynamics 365 organization. But in most cases, you would also be required to do something with the retrieved records. Let’s see how we can do that.

Work with the retrieved records

Let’s say we want to update one field in all records. In Dynamics 365, if we want to bulk update, delete or create records, we use ExecuteMultipleRequest. Since its Requests collection takes OrganizationRequest instances, we will use UpdateRequest (which derives from OrganizationRequest) to perform the update. We could also call service.Update() for each record, but performing the operations in bulk using service.Execute() is more performant.
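For comparison, the one-call-per-record alternative mentioned above would look roughly like this (a sketch; each service.Update() is a separate round trip to the server, which is what makes it slow for large collections):

```csharp
// One web service call per record - simple, but slow for thousands of records.
foreach (Entity entity in completeAccountCollection)
{
    Entity myEntity = new Entity("account") { Id = entity.Id };
    myEntity["status"] = new OptionSetValue(0);
    service.Update(myEntity);
}
```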

List<UpdateRequest> requests = completeAccountCollection.Select(entity =>
{
    Entity myEntity = new Entity("account");
    myEntity.Id = entity.Id;
    myEntity["status"] = new OptionSetValue(0);
    // Instead of updating here, we return an UpdateRequest to use with Execute.
    return new UpdateRequest { Target = myEntity };
}).ToList();

The code below not only executes the requests in bulk but also reports progress on where we are in the process:

//There is a limit to how many requests can be added to a request collection.
//If that limit is exceeded, a fault is thrown before the first request is ever executed.
//A limit of 1,000 requests is typical, but in reality it depends on the CRM server processing time.
//So we split the list of requests into child lists.
//Specify the size of the batch, i.e. 30 here (to be on the safe side).
var chunkSize = 30;
var chunks = SplitList(requests, chunkSize).ToList();
chunks
    .Select((mychunk, index) => (mychunk, index: index + 1)).ToList()
    .ForEach(x =>
    {
        try
        {
            //Create an empty organization request collection.
            var orc = new OrganizationRequestCollection();
            //Add the UpdateRequest for each entity to the org request collection.
            x.mychunk.ForEach(r => orc.Add(r));

            var emRequest = new ExecuteMultipleRequest()
            {
                //Assign settings that define execution behavior: continue on error, return responses.
                Settings = new ExecuteMultipleSettings { ContinueOnError = false, ReturnResponses = true },
                Requests = orc
            };

            //Execute all requests in the request collection using a single web method call.
            ExecuteMultipleResponse response = (ExecuteMultipleResponse)service.Execute(emRequest);

            //The code below tracks the progress chunk by chunk.
            if (response.IsFaulted)
            {
                foreach (var myresp in response.Responses.Where(r => r.Fault != null))
                    Console.WriteLine($"Error in processing chunk {x.index}. Error: {myresp.Fault.Message}");
            }
            else
            {
                Console.WriteLine($"\rUpdated chunk {x.index} out of {chunks.Count}.");
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Exception occurred in chunk {x.index}: {ex}.");
        }
    });

/// <summary>
/// Function to split a list into child lists.
/// </summary>
public static IEnumerable<List<T>> SplitList<T>(List<T> collection, int nSize = 30)
{
    for (int i = 0; i < collection.Count; i += nSize)
        yield return collection.GetRange(i, Math.Min(nSize, collection.Count - i));
}

Just like the UpdateRequest, you can create a DeleteRequest or CreateRequest as required.
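For instance, deleting the same set of records would only change the request type; a sketch (the chunking and ExecuteMultipleRequest code above stays exactly the same):

```csharp
// DeleteRequest takes an EntityReference instead of a full Entity.
List<DeleteRequest> deleteRequests = completeAccountCollection.Select(entity =>
    new DeleteRequest
    {
        Target = new EntityReference("account", entity.Id)
    }).ToList();
```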

Hope this code helps you the next time you face difficulty with that huge Dynamics data!