We are gathering this year's Japan-based speakers at the International Conference on Machine Learning (ICML) and the ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), two top international conferences in machine learning and data mining, for a pre-conference session that doubles as presentation practice. Presentations will be given in English and follow each conference's presentation format. From LAPRAS, researcher/algorithm engineer Suzuki will present his ICML 2019 work.
Date and Time
Jun 3rd, 2019, 19:00-22:00
Participants
- Presenters: ICML 2019 / KDD 2019 speakers
- Listeners: Anyone interested in the presentations is welcome; attendees who ask questions are especially appreciated.
Place
LIVING by LAPRAS
2F, Dogenzaka-Sky Bldg., 28-1 Maruyama-cho, Shibuya, Tokyo
Schedule (Tentative)
| Time  | Program                |
|-------|------------------------|
| 18:45 | Reception opens        |
| 19:00 | Opening remarks        |
| 19:10 | Oral Presentation 1    |
| 19:15 | Oral Presentation 2    |
| 19:20 | Oral Presentation 3    |
| 19:25 | Oral Presentation 4    |
| 19:30 | (Oral Presentation 5)  |
| 19:35 | (Oral Presentation 6)  |
| 20:40 | Social time            |
| 21:30 | Close                  |
Presenters (in no particular order; honorifics omitted)
鈴木 亮太 / Ryota Suzuki (LAPRAS Inc.)
Researcher / Algorithm Engineer at LAPRAS Inc. After working at the research laboratory of a major electronics manufacturer, he joined LAPRAS (formerly scouty) in November 2018.
Hyperbolic Disk Embeddings for Directed Acyclic Graphs (ICML 2019)
Obtaining continuous representations of structural data such as directed acyclic graphs (DAGs) has gained attention in machine learning and artificial intelligence. However, embedding complex DAGs in which the numbers of both ancestors and descendants of nodes grow exponentially is difficult. Tackling this problem, we develop Disk Embeddings, a framework for embedding DAGs into quasi-metric spaces. Existing state-of-the-art methods, Order Embeddings and Hyperbolic Entailment Cones, are instances of Disk Embeddings in Euclidean space and spheres, respectively. Furthermore, we propose a novel method, Hyperbolic Disk Embeddings, to handle the exponential growth of relations. The results of our experiments show that our Disk Embedding models outperform existing methods, especially on complex DAGs other than trees.
Kaito Fujii / 藤井 海斗 (Univ. Tokyo)
Beyond adaptive submodularity: Approximation guarantees of greedy policy with adaptive submodularity ratio (ICML 2019)
We propose a new concept named the adaptive submodularity ratio to study the greedy policy for sequential decision making. While the greedy policy is known to perform well for a wide variety of adaptive stochastic optimization problems in practice, its theoretical properties have been analyzed only for a limited class of problems. We narrow the gap between theory and practice by using the adaptive submodularity ratio, which enables us to prove approximation guarantees of the greedy policy for a substantially wider class of problems. Examples of newly analyzed problems include important applications such as adaptive influence maximization and adaptive feature selection. Our adaptive submodularity ratio also provides bounds on adaptivity gaps. Experiments confirm that the greedy policy performs well on the applications considered, compared to standard heuristics.
Michael Metel (RIKEN AIP)
Postdoctoral researcher at RIKEN AIP. Currently focused on stochastic first-order methods with applications in machine learning.
Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization (ICML 2019)
Our work focuses on stochastic gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer. Research on this class of problems is quite limited, and until very recently no non-asymptotic convergence results had been reported. We present two simple stochastic gradient algorithms, for finite-sum and general stochastic optimization problems, which have superior convergence complexities compared to the current state of the art. We also demonstrate our algorithms' better performance in practice for empirical risk minimization on well-known datasets.
Shunsuke Kitada / 北田 俊輔 (Hosei Univ.)
Second-year master's student in the Major in Applied Informatics, Graduate School of Science and Engineering, Hosei University. His research spans multiple areas, including natural language processing, medical image processing, and computational advertising.
Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creative (KDD 2019)
Accurately predicting conversions in advertisements is generally a challenging task, because such conversions do not occur frequently. In this paper, we propose a new framework to support creating high-performing ad creatives, including the accurate prediction of ad creative text conversions before delivery to the consumer. The proposed framework includes three key ideas: multi-task learning, conditional attention, and attention highlighting. Multi-task learning improves conversion prediction accuracy by predicting clicks and conversions simultaneously, mitigating the difficulty of data imbalance. Furthermore, conditional attention focuses the attention for each ad creative according to its genre and target gender, thus improving conversion prediction accuracy. Attention highlighting visualizes important words and/or phrases based on conditional attention. We evaluated the proposed framework with actual delivery history data (14,000 creatives displayed more than a certain number of times, from Gunosy Inc.), and confirmed that these ideas improve the prediction performance of conversions and visualize noteworthy words according to the creatives' attributes.
- We plan to take photos during the event and post them on social media such as Facebook. If this is a problem for you, please let us know in advance.