Thanks for the help.
I've run "set option show_pio_costing on" against a two-table join and get 583 lines of output.
It seems to consist of 26 "START NEW COSTING" sections followed by a "FINALPLAN" section.
I can vaguely see that each COSTING section then contains a CACHE COSTING and a PIO COSTING,
with lines like this:
(BUF CACHE FOR INDEX SCAN - B_PK, CacheId=2, MassSize=16384, Fraction=1, MRU=0, WashSize=417792, PoolSize=5222400, CacheAvail=5222400, FrontierCt=1, 2K_PIO=141815, PageCR=0.9714535, MassSize=16384, Outer=1, InOrder=1.531751e+07, ScanPg=141812, LIO=141815, PIO=21269.15)
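For reference, a minimal sketch of the kind of session that produces this. The query here is a simplified stand-in rather than my actual one; the table and index names (DB..A, DB..B, A_PK, B_PK) and the correlation name "t" are taken from the trace lines, while the join column "id" is just a placeholder:

  -- Sketch only: tables/indexes come from the trace output above;
  -- the join column "id" is a placeholder.
  dbcc traceon(3604)            -- route trace output to this session
  go
  set option show_pio_costing on
  go
  select *
  from DB..A t, DB..B u         -- "t" matches the correlation name in the plan
  where t.id = u.id
  go
  set option show_pio_costing off
  go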
The total cost in the final plan doesn't seem to relate to any of the other plan costs that show up.
In fact, a lot of the costs show up as zero, e.g.:
( PopIndScan A_PK DB..A t ) cost:0 tempdb:0 order: none
( PopRidJoin ( PopIndScan A_PK DB..A t ) ) cost:0 tempdb:0 order: none
etc.
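For comparison, the standard diagnostics (showplan and statistics io/time) can be enabled alongside; a minimal sketch using the same stand-in query as above, though as far as I can see neither surfaces these internal PIO estimates:

  -- Standard diagnostics, for comparison with the costing trace.
  set showplan on
  go
  set statistics io on
  go
  set statistics time on
  go
  select *
  from DB..A t, DB..B u         -- same stand-in query as above
  where t.id = u.id
  go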
> Most of the information in the set option show_* output is for engineering to investigate an issue.
Is any of this documented, so that developers can understand what's happening?