Channel: SCN: Message List - SAP Adaptive Server Enterprise (SAP ASE) for Custom Applications

Re: How to improve performance to insert large amount of data to a table?


Some possible reasons for slow insert performance:

- having to update indexes for each insert (dropping the indexes before the load and recreating them afterwards may improve overall performance, assuming no concurrent 'read' processes need those indexes to perform well)
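As a sketch, the drop-then-recreate pattern looks like this (table and index names are hypothetical; script out your real index definitions with sp_helpindex first):

```sql
-- Hypothetical names: big_table / ix_big_table_status
-- Capture the index definitions first: sp_helpindex big_table
drop index big_table.ix_big_table_status
go

-- ... run the bulk insert here ...

-- Recreate the index once the load is finished
create nonclustered index ix_big_table_status
    on big_table (status)
go
```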

- having to perform RI/FK constraint checking (dropping the RI/FK constraints may improve overall performance, assuming you can forgo the FK checks during your process)
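A hedged sketch of dropping and re-adding a FK constraint (constraint, table, and column names are hypothetical; sp_helpconstraint lists the real ones):

```sql
alter table big_table
    drop constraint fk_big_table_parent
go

-- ... run the bulk insert here ...

-- Re-add the constraint once the load is finished
alter table big_table
    add constraint fk_big_table_parent
    foreign key (parent_id) references parent_table (id)
go
```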

- having to perform insert-trigger processing (dropping or disabling the triggers may improve overall performance, assuming your batch process, and any other concurrent inserts, can live without the trigger logic)
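One way to sketch this, assuming a trigger named ins_big_table (save its source before dropping it, so you can recreate it afterwards):

```sql
-- Capture the trigger text first: sp_helptext ins_big_table
drop trigger ins_big_table
go

-- ... run the bulk insert here ...

-- Recreate the trigger from the saved source once the load is finished
```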

- performing inserts one at a time without a transaction wrapper (grouping many inserts into a single transaction reduces the volume of synchronous log writes, eg, begin tran ... perform 500 inserts ... commit tran, repeated until all rows are loaded)
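A minimal sketch of batched transactions, assuming the rows are being copied from a staging table (all names hypothetical); set rowcount caps each insert/select at 500 rows, and the not-exists predicate makes each pass pick up rows not yet copied:

```sql
set rowcount 500
declare @rows int
select @rows = 1
while @rows > 0
begin
    begin tran
    -- copy the next batch of (up to) 500 rows
    insert into big_table (id, payload)
    select s.id, s.payload
      from staging_table s
     where not exists (select 1 from big_table t where t.id = s.id)
    select @rows = @@rowcount
    commit tran
end
set rowcount 0
```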

- using cursor-based processing instead of set-based processing (set-based processing is typically more efficient, assuming there are no technical reasons for the cursor-based approach)
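For contrast, a hedged sketch (hypothetical names): where a cursor loop fetches and inserts one row per iteration, a single insert/select moves the whole set in one statement:

```sql
-- Instead of declaring a cursor over staging_table and inserting
-- one row per fetch, do the same work in a single set-based pass:
insert into big_table (id, payload)
select id, payload
  from staging_table
 where payload is not null
```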

- inserting rows into an APL (allpages-locked) table with large quantities of duplicate keys for a non-unique clustered index (as the overflow page chain grows, each insert takes longer to perform)

- inefficient SQL coding that's eating up most of your wall-clock time


