JUCS - Journal of Universal Computer Science 24(10): 1378-1402, doi: 10.3217/jucs-024-10-1378
Learning Concept Embeddings from Temporal Data
Francois Meyer‡, Brink van der Merwe‡, Dirko Coetsee§
‡ Stellenbosch University, Stellenbosch, South Africa
§ Praelexis, Stellenbosch, South Africa
Open Access
Word embedding techniques can be used to learn vector representations of concepts from temporal datasets. Previous attempts to do this amounted to applying word embedding techniques directly to event sequences. We propose a concept embedding model that extends existing word embedding techniques to take time into account by explicitly modelling the time between concept occurrences. The model is implemented and evaluated using medical temporal data. We find that incorporating time into the learning algorithm can improve the quality of the resulting embeddings, as measured by an existing methodological framework for evaluating medical concept embeddings.
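The general idea of taking inter-event time into account can be illustrated with a minimal sketch. The function below is a hypothetical example, not the paper's actual model: it generates skip-gram training pairs from a timestamped event sequence, but down-weights each (target, context) pair by an exponential decay in the time elapsed between the two events, rather than treating all contexts within the window as equally relevant.

```python
import math

def time_weighted_pairs(events, window=2, decay=0.1):
    """Generate (target, context, weight) skip-gram pairs from a list of
    (concept, timestamp) events. Contexts that occur close in time to the
    target receive weights near 1; temporally distant contexts are
    down-weighted by exp(-decay * |t_target - t_context|).

    This is an illustrative sketch of time-aware pair generation, not the
    model described in the paper.
    """
    pairs = []
    for i, (target, t_i) in enumerate(events):
        lo = max(0, i - window)
        hi = min(len(events), i + window + 1)
        for j in range(lo, hi):
            if j == i:
                continue  # a concept is not its own context
            context, t_j = events[j]
            weight = math.exp(-decay * abs(t_i - t_j))
            pairs.append((target, context, weight))
    return pairs

# Hypothetical medical events with timestamps in days:
events = [("diagnosis:flu", 0.0),
          ("drug:oseltamivir", 1.0),
          ("lab:full_blood_count", 10.0)]
pairs = time_weighted_pairs(events, window=2, decay=0.1)
```

In a plain skip-gram model all three concepts would be equally strong contexts for one another; here the flu/oseltamivir pair (one day apart) keeps a weight near 1, while the pair involving the lab test ten days later is discounted, so the learned embeddings reflect temporal proximity rather than just co-occurrence order.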
deep learning, natural language processing, word embeddings, temporal data, skip-gram