References
[1]
P. Indyk and R. Motwani, Approximate nearest neighbors: Towards removing the curse of dimensionality, in Proc. 30th Annu. ACM Symp. Theory of Computing, Dallas, TX, USA, 1998, pp. 604-613.
[2]
S. Wold, K. Esbensen, and P. Geladi, Principal component analysis, Chemometr. Intell. Lab. Syst., vol. 2, nos. 1-3, pp. 37-52, 1987.
[3]
A. Gelman, J. B. Carlin, H. S. Stern, D. B. Dunson, A. Vehtari, and D. B. Rubin, Bayesian Data Analysis. 3rd ed. Boca Raton, FL, USA: CRC Press, 2013.
[4]
M. Dax, S. R. Green, J. Gair, J. H. Macke, A. Buonanno, and B. Schölkopf, Real-time gravitational-wave science with neural posterior estimation, arXiv preprint arXiv:2106.12594, 2021.
[5]
J. Alsing and W. Handley, Nested sampling with any prior you like, Mon. Not. Roy. Astron. Soc.: Lett., vol. 505, no. 1, pp. L95-L99, 2021.
[6]
J. M. Hammersley and D. C. Handscomb, Monte Carlo Methods. Dordrecht, the Netherlands: Springer, 1964.
[7]
H. Gabbard, C. Messenger, I. S. Heng, F. Tonolini, and R. Murray-Smith, Bayesian parameter estimation using conditional variational autoencoders for gravitational-wave astronomy, arXiv preprint arXiv:1909.06296, 2020.
[8]
S. R. Green, C. Simpson, and J. Gair, Gravitational-wave parameter estimation with autoregressive neural network flows, Phys. Rev. D, vol. 102, no. 10, p. 104057, 2020.
[9]
A. Delaunoy, A. Wehenkel, T. Hinderer, S. Nissanke, C. Weniger, A. R. Williamson, and G. Louppe, Lightning-fast gravitational wave parameter inference through neural amortization, arXiv preprint arXiv:2010.12931, 2020.
[10]
S. R. Green and J. Gair, Complete parameter inference for GW150914 using deep learning, Mach. Learn.: Sci. Technol., vol. 2, no. 3, p. 03LT01, 2021.
[11]
J. Veitch, V. Raymond, B. Farr, W. Farr, P. Graff, S. Vitale, B. Aylott, K. Blackburn, N. Christensen, M. Coughlin, et al., Parameter estimation for compact binaries with ground-based gravitational-wave observations using the LALInference software library, Phys. Rev. D, vol. 91, no. 4, p. 042003, 2015.
[12]
G. E. Batista, A. L. C. Bazzan, and M. C. Monard, Balancing training data for automated annotation of keywords: A case study, in Proc. 2nd Brazilian Workshop on Bioinformatics, Macaé, Brazil, 2003, pp. 10-18.
[13]
G. E. A. P. A. Batista, R. C. Prati, and M. C. Monard, A study of the behavior of several methods for balancing machine learning training data, ACM SIGKDD Explor. Newsl., vol. 6, no. 1, pp. 20-29, 2004.
[14]
G. Lemaître, F. Nogueira, and C. K. Aridas, Imbalanced-learn: A Python toolbox to tackle the curse of imbalanced datasets in machine learning, J. Mach. Learn. Res., vol. 18, no. 1, pp. 559-563, 2017.
[15]
N. V. Chawla, K. W. Bowyer, L. O. Hall, and W. P. Kegelmeyer, SMOTE: Synthetic minority over-sampling technique, J. Artif. Intell. Res., vol. 16, pp. 321-357, 2002.
[16]
I. Tomek, Two modifications of CNN, IEEE Trans. Syst. Man Cybern., vol. SMC-6, no. 11, pp. 769-772, 1976.
[17]
B. P. Abbott, R. Abbott, T. D. Abbott, S. Abraham, F. Acernese, K. Ackley, C. Adams, V. B. Adya, C. Affeldt, M. Agathos, et al., A guide to LIGO-Virgo detector noise and extraction of transient gravitational-wave signals, Class. Quantum Grav., vol. 37, no. 5, p. 055002, 2020.
[18]
W. H. A. Schilders, H. A. van der Vorst, and J. Rommes, Model Order Reduction: Theory, Research Aspects and Applications. Berlin, Germany: Springer, 2008.
[19]
P. Canizares, S. E. Field, J. R. Gair, and M. Tiglio, Gravitational wave parameter estimation with compressed likelihood evaluations, Phys. Rev. D, vol. 87, no. 12, p. 124005, 2013.
[20]
R. Smith, S. E. Field, K. Blackburn, C. J. Haster, M. Pürrer, V. Raymond, and P. Schmidt, Fast and accurate inference on gravitational waves from precessing compact binaries, Phys. Rev. D, vol. 94, no. 4, p. 044031, 2016.
[21]
I. Kobyzev, S. J. D. Prince, and M. A. Brubaker, Normalizing flows: An introduction and review of current methods, IEEE Trans. Pattern Anal. Mach. Intell., vol. 43, no. 11, pp. 3964-3979, 2021.
[22]
G. Papamakarios, T. Pavlakou, and I. Murray, Masked autoregressive flow for density estimation, arXiv preprint arXiv:1705.07057, 2018.
[23]
C. W. Huang, D. Krueger, A. Lacoste, and A. Courville, Neural autoregressive flows, in Proc. 35th Int. Conf. Machine Learning, Stockholm, Sweden, 2018, pp. 2078-2087.
[24]
L. Dinh, J. Sohl-Dickstein, and S. Bengio, Density estimation using real NVP, arXiv preprint arXiv:1605.08803, 2017.
[25]
D. P. Kingma and P. Dhariwal, Glow: Generative flow with invertible 1×1 convolutions, arXiv preprint arXiv:1807.03039, 2018.
[26]
C. Durkan, A. Bekasov, I. Murray, and G. Papamakarios, Neural spline flows, in Proc. 33rd Conf. Neural Information Processing Systems, Vancouver, Canada, 2019, pp. 7511-7522.
[27]
L. Dinh, D. Krueger, and Y. Bengio, NICE: Non-linear independent components estimation, arXiv preprint arXiv:1410.8516, 2015.
[28]
T. Müller, B. McWilliams, F. Rousselle, M. Gross, and J. Novák, Neural importance sampling, ACM Trans. Graph., vol. 38, no. 5, p. 145, 2019.
[29]
C. Durkan, A. Bekasov, I. Murray, and G. Papamakarios, Cubic-spline flows, arXiv preprint arXiv:1906.02145, 2019.
[30]
K. M. He, X. Y. Zhang, S. Q. Ren, and J. Sun, Identity mappings in deep residual networks, in Proc. 14th European Conf. Computer Vision, Amsterdam, the Netherlands, 2016, pp. 630-645.
[31]
B. P. Abbott, R. Abbott, T. D. Abbott, M. R. Abernathy, F. Acernese, K. Ackley, C. Adams, T. Adams, P. Addesso, R. X. Adhikari, et al., GW150914: The advanced LIGO detectors in the era of first discoveries, Phys. Rev. Lett., vol. 116, no. 13, p. 131103, 2016.
[32]
B. Farr, E. Ochsner, W. M. Farr, and R. O'Shaughnessy, A more effective coordinate system for parameter estimation of precessing compact binaries from gravitational waves, Phys. Rev. D, vol. 90, no. 2, p. 024018, 2014.
[33]
B. P. Abbott, R. Abbott, T. D. Abbott, S. Abraham, F. Acernese, K. Ackley, C. Adams, R. X. Adhikari, V. B. Adya, C. Affeldt, et al., GWTC-1: A gravitational-wave transient catalog of compact binary mergers observed by LIGO and Virgo during the first and second observing runs, Phys. Rev. X, vol. 9, no. 3, p. 031040, 2019.
[34]
I. M. Romero-Shaw, C. Talbot, S. Biscoveanu, V. D'Emilio, G. Ashton, C. P. L. Berry, S. Coughlin, S. Galaudage, C. Hoy, M. Hübner, et al., Bayesian inference for compact binary coalescences with BILBY: Validation and application to the first LIGO-Virgo gravitational-wave transient catalogue, Mon. Not. Roy. Astron. Soc., vol. 499, no. 3, pp. 3295-3319, 2020.
[35]
G. Ashton, M. Hübner, P. D. Lasky, C. Talbot, K. Ackley, S. Biscoveanu, Q. Chu, A. Divakarla, P. J. Easter, B. Goncharov, et al., BILBY: A user-friendly Bayesian inference library for gravitational-wave astronomy, Astrophys. J. Suppl. Ser., vol. 241, no. 2, p. 27, 2019.
[36]
M. Hannam, P. Schmidt, A. Bohé, L. Haegel, S. Husa, F. Ohme, G. Pratten, and M. Pürrer, Simple model of complete precessing black-hole-binary gravitational waveforms, Phys. Rev. Lett., vol. 113, no. 15, p. 151101, 2014.
[37]
S. Khan, S. Husa, M. Hannam, F. Ohme, M. Pürrer, X. J. Forteza, and A. Bohé, Frequency-domain gravitational waves from nonprecessing black-hole binaries. II. A phenomenological model for the advanced detector era, Phys. Rev. D, vol. 93, no. 4, p. 044007, 2016.
[38]
A. Bohé, M. Hannam, S. Husa, F. Ohme, M. Pürrer, and P. Schmidt, PhenomPv2: Technical notes for the LAL implementation, LIGO Technical Document, LIGO-T1500602-v4, 2016.
[40]
A. Paszke, S. Gross, F. Massa, A. Lerer, J. Bradbury, G. Chanan, T. Killeen, Z. M. Lin, N. Gimelshein, L. Antiga, et al., PyTorch: An imperative style, high-performance deep learning library, in Proc. 33rd Int. Conf. Neural Information Processing Systems, Vancouver, Canada, 2019, pp. 8026-8037.
[42]
S. Ioffe and C. Szegedy, Batch normalization: Accelerating deep network training by reducing internal covariate shift, in Proc. 32nd Int. Conf. Machine Learning, Lille, France, 2015, pp. 448-456.
[43]
D. A. Clevert, T. Unterthiner, and S. Hochreiter, Fast and accurate deep network learning by exponential linear units (ELUs), arXiv preprint arXiv:1511.07289, 2016.
[44]
I. Loshchilov and F. Hutter, SGDR: Stochastic gradient descent with warm restarts, arXiv preprint arXiv:1608.03983, 2017.
[45]
D. P. Kingma and J. Ba, Adam: A method for stochastic optimization, arXiv preprint arXiv:1412.6980, 2015.
[46]
J. Lin, Divergence measures based on the Shannon entropy, IEEE Trans. Inform. Theory, vol. 37, no. 1, pp. 145-151, 1991.
[47]
S. Brooks, A. Gelman, G. Jones, and X. L. Meng, Handbook of Markov Chain Monte Carlo. Boca Raton, FL, USA: CRC Press, 2011.
[48]
J. Skilling, Nested sampling for general Bayesian computation, Bayesian Anal., vol. 1, no. 4, pp. 833-859, 2006.
[49]
J. Buchner, Nested sampling methods, arXiv preprint arXiv:2101.09675, 2021.
[50]
B. P. Abbott, R. Abbott, T. D. Abbott, F. Acernese, K. Ackley, C. Adams, T. Adams, P. Addesso, R. X. Adhikari, V. B. Adya, et al., GW170608: Observation of a 19 solar-mass binary black hole coalescence, Astrophys. J. Lett., vol. 851, no. 2, p. L35, 2017.
[51]
D. Foreman-Mackey, corner.py: Scatterplot matrices in Python, J. Open Source Softw., vol. 1, no. 2, p. 24, 2016.
[52]
S. van der Walt, S. C. Colbert, and G. Varoquaux, The NumPy array: A structure for efficient numerical computation, Comput. Sci. Eng., vol. 13, no. 2, pp. 22-30, 2011.
[53]
P. Virtanen, R. Gommers, T. E. Oliphant, M. Haberland, T. Reddy, D. Cournapeau, E. Burovski, P. Peterson, W. Weckesser, J. Bright, et al., SciPy 1.0: Fundamental algorithms for scientific computing in Python, Nat. Methods, vol. 17, no. 3, pp. 261-272, 2020.
[54]
W. McKinney, pandas: A foundational Python library for data analysis and statistics, Python for High Performance and Scientific Computing, vol. 14, no. 9, pp. 1-9, 2011.
[55]
J. S. Speagle, dynesty: A dynamic nested sampling package for estimating Bayesian posteriors and evidences, Mon. Not. Roy. Astron. Soc., vol. 493, no. 3, pp. 3132-3158, 2020.
[56]
J. D. Hunter, Matplotlib: A 2D graphics environment, Comput. Sci. Eng., vol. 9, no. 3, pp. 90-95, 2007.