Word-Level Embedding to Improve Performance of Representative Spatio-temporal Document Classification
Byoungwook Kim, Hong-Jun Jang, Journal of Information Processing Systems Vol. 19, No. 6, pp. 830-841, Dec. 2023
https://doi.org/10.3745/JIPS.04.0296
Keywords: Spatio-temporal Document Classification, Tokenization, Word-Level Embedding
Cite this article
[APA Style]
Kim, B., & Jang, H.-J. (2023). Word-Level Embedding to Improve Performance of Representative Spatio-temporal Document Classification. Journal of Information Processing Systems, 19(6), 830-841. DOI: 10.3745/JIPS.04.0296.
[IEEE Style]
B. Kim and H.-J. Jang, "Word-Level Embedding to Improve Performance of Representative Spatio-temporal Document Classification," Journal of Information Processing Systems, vol. 19, no. 6, pp. 830-841, 2023. DOI: 10.3745/JIPS.04.0296.
[ACM Style]
Byoungwook Kim and Hong-Jun Jang. 2023. Word-Level Embedding to Improve Performance of Representative Spatio-temporal Document Classification. Journal of Information Processing Systems, 19, 6, (2023), 830-841. DOI: 10.3745/JIPS.04.0296.