Open Access

Statistical Learning for Semantic Parsing: A Survey

Qile Zhu, Xiyao Ma, and Xiaolin Li

National Science Foundation Center for Big Learning, University of Florida, Gainesville, FL 32608, USA.

Abstract

A long-term goal of Artificial Intelligence (AI) is to provide machines with the capability of understanding natural language. Here, understanding natural language means that the system must produce a correct response to the input it receives; the response can be a robot action, an answer to a question, and so on. One way to achieve this goal is semantic parsing, which maps utterances into semantic representations called logical forms, a representation that captures many important linguistic phenomena and can be directly processed by machines. Semantic parsing is a fundamental problem in natural language understanding, and in recent years researchers have made tremendous progress in this field. In this paper, we review recent algorithms for semantic parsing, covering both conventional machine learning approaches and deep learning approaches. We first give an overview of a semantic parsing system and then summarize a general statistical learning framework for semantic parsing. With the rise of deep learning, we pay particular attention to deep learning based semantic parsing, especially its application to Knowledge Base Question Answering (KBQA). Finally, we survey several benchmarks for KBQA.
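To make the utterance-to-logical-form mapping concrete, the toy sketch below maps a question to a Prolog-style logical form using a single hand-written pattern and a tiny lexicon. It is purely illustrative and is not one of the surveyed systems; the pattern, the lexicon entries, and the predicate names such as CapitalOf are assumptions made only for this example.

# Illustrative sketch only: map an utterance to a logical form with a toy
# pattern grammar and lexicon. The pattern, lexicon entries, and predicate
# names (e.g., CapitalOf) are hypothetical, not taken from the surveyed work.
import re

LEXICON = {
    "capital": "CapitalOf",        # relation word -> assumed KB predicate
    "population": "PopulationOf",
    "france": "France",            # entity mention -> assumed KB entity
    "germany": "Germany",
}

def parse(utterance: str) -> str:
    """Return a Prolog-style logical form, or '' if the toy grammar fails."""
    text = utterance.lower().strip(" ?!.")
    m = re.match(r"what is the (\w+) of (\w+)$", text)
    if m is None:
        return ""
    relation, entity = LEXICON.get(m.group(1)), LEXICON.get(m.group(2))
    if relation is None or entity is None:
        return ""
    return f"answer(x) :- {relation}({entity}, x)"

if __name__ == "__main__":
    # Prints: answer(x) :- CapitalOf(France, x)
    print(parse("What is the capital of France?"))

Executing such a logical form against a knowledge base would then yield the answer; the statistical and neural parsers surveyed in the paper learn this mapping from data instead of relying on hand-written rules.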

Big Data Mining and Analytics
Pages 217-239
Cite this article:
Zhu Q, Ma X, Li X. Statistical Learning for Semantic Parsing: A Survey. Big Data Mining and Analytics, 2019, 2(4): 217-239. https://doi.org/10.26599/BDMA.2019.9020011


Received: 17 September 2018
Accepted: 29 April 2019
Published: 05 August 2019
© The author(s) 2019

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
