#import "globals.typ": * = Halo growth == Motivation #layouts.contained( [ Crucial dependence on the *star formation rate* - assumed to be directly linked to halo growth rate $dot(M)$: $ dot(M)_star = f_star (M_"h") dot dot(M_"h") $ - growth according to the exponential model: $ M_"h" (z) = M_"h" (z_0) dot exp[-alpha (z - z_0)] $ with $alpha = dot(M_"h")/M_"h"$ the _specific growth rate_ #pause ], [ $->$ inaccurate when applied to all halos $->$ #text(weight: "bold")[inconsistent] with the N-body output #pause $->$ how to implement #text(weight: "bold")[consistent] growth? ] ) == Effect on the flux profiles #let notebook = json("../workdir/11_visualization/alpha_dependence_of_profiles.ipynb") #image_cell(notebook, cell_id: "profile_plot_alpha_dependence") $M_"h" = 6.08 dot 10^11 M_dot.circle$ (fixed) $==>$ correction up to $times 5$ // COMMENTS // That will be directly affect the global signal as well // shifting // // Yu-Siu already investigated the more nuanced effect of stochasticity but the approach we propose should supersede that #pagebreak() == Inferring growth from #smallcaps[Thesan] data - already includes precomputed merger trees @Springel2005 // ideal for rapid iterations - follow main progenitor branch back in time - fit the exponential model to main progrenitor branch // in a parallelized fashion => want to stay fast // fix the original mass for max. consistency // fix the allowed dynamic range - use *individual growth* to select profile // this sort of "breaks the degeneracy" between halos of the same mass but different growth histories - *self-consistent*#pause$*$#meanwhile treatment of halo growth leveraging the snapshots #pagebreak() #let notebook = json("../workdir/11_visualization/show_trees.ipynb") #[ #set image(width: 100%, fit: "contain") #image_cell(notebook, cell_id: "merger_tree_and_fitting") ] // COMMENTS: // no clear trend between mass and growth rate