
CCF Rank-B Conference EMNLP 2024 Deadline Approaching (with Submission Discussion Group)

轻松参会 · WeChat Official Account · 2024-04-27 15:14


See the end of this article for the discussion group.


Full conference name: Conference on Empirical Methods in Natural Language Processing

Acceptance rate: 23.33% (2021)

CCF rank: B (Artificial Intelligence)

Submission deadline: 2024/6/15

Notification of acceptance: 2024/9/20

Official website:

https://2024.emnlp.org/

Call for papers:

EMNLP 2024 aims to have a broad technical program. Relevant topics for the conference include, but are not limited to, the following areas:

  • Computational Social Science and Cultural Analytics

  • Dialogue and Interactive Systems

  • Discourse and Pragmatics

  • Low-resource Methods for NLP

  • Ethics, Bias, and Fairness

  • Generation

  • Information Extraction

  • Information Retrieval and Text Mining

  • Interpretability and Analysis of Models for NLP

  • Linguistic theories, Cognitive Modeling and Psycholinguistics

  • Machine Learning for NLP

  • Machine Translation

  • Multilinguality and Language Diversity

  • Multimodality and Language Grounding to Vision, Robotics and Beyond

  • Phonology, Morphology and Word Segmentation

  • Question Answering

  • Resources and Evaluation

  • Semantics: Lexical, Sentence-level Semantics, Textual Inference and Other areas

  • Sentiment Analysis, Stylistic Analysis, and Argument Mining

  • Speech processing and spoken language understanding

  • Summarization

  • Syntax: Tagging, Chunking and Parsing

  • NLP Applications

  • Special Theme: Efficiency in Model Algorithms, Training, and Inference

EMNLP 2024 Theme Track: Efficiency in Model Algorithms, Training, and Inference

This track provides a platform for researchers to explore key aspects of making model algorithms, training, and inference more efficient, e.g., quantization, data requirements, and model size. We welcome submissions that propose innovative approaches, methodologies, and techniques to streamline the training and inference process for language models while optimizing resource utilization and reducing model size. Authors are encouraged to explore various ways to enhance efficiency, including parameter-efficient tuning and methods for learning with less data and smaller model sizes, ultimately leading to more scalable, practical, and resource-efficient NLP systems.
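
To make the theme track's mention of parameter-efficient tuning concrete, here is a purely illustrative sketch (not part of the official CFP): one well-known parameter-efficient technique, a LoRA-style low-rank adapter, written in plain PyTorch. The class name `LoRALinear` and the hyperparameters `r` and `alpha` are hypothetical choices for this sketch, not anything prescribed by EMNLP.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Illustrative LoRA-style adapter: freeze the pretrained weight W
    and learn only a low-rank update B @ A on top of it."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze the pretrained weights
            p.requires_grad = False
        # Standard LoRA init: A is small Gaussian, B is zero,
        # so the adapter starts out as a no-op.
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = base(x) + scale * x @ A^T @ B^T
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

# Usage: wrap an existing layer; only A and B receive gradients.
layer = LoRALinear(nn.Linear(768, 768), r=8)
out = layer(torch.randn(4, 768))
```

Only the low-rank factors `A` and `B` are trainable, so the trainable-parameter count drops from in_features × out_features to r × (in_features + out_features), which is the kind of training-efficiency gain this theme track solicits.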






