

ICML/KDD 2019 Pre-conference session


Organizer: LAPRAS Inc.

Registration info

Presenter slot (ICML/KDD 2019 speakers)

Free

FCFS
0/2

General attendees

Free

FCFS
39/50

Description

Overview

Speakers at this year's ICML and KDD who are based in Japan will gather for a pre-conference session that doubles as presentation practice. Talks will be given in English, following each conference's presentation format.

We will gather this year's speakers at the International Conference on Machine Learning (ICML) and the ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), two of the top international conferences in machine learning and data mining, for a pre-conference session that doubles as presentation practice. From LAPRAS, researcher / algorithm engineer Ryota Suzuki will present his ICML 2019 work.

Whether you want to present, ask questions, or just listen, everyone is welcome. Pointed technical critique is also welcome.

Note: Since the session is practice for the actual conferences, all presentations will be in English.

We are also looking for about two additional presenters. If you would like to use this as a practice and networking opportunity before the conferences, please register in the presenter slot.

Event Day Information

日時 / Date and Time

June 3rd (Mon), 2019, 19:00-22:00

対象 / Participants

  • Presenter: ICML 2019 / KDD 2019 Speakers
  • Listener: Anyone interested in the presentations is welcome; those who ask questions are especially appreciated.

場所 / Place

LIVING by LAPRAS

2F, Dogenzaka-Sky Bldg., 28-1 Maruyama-cho, Shibuya, Tokyo
https://goo.gl/maps/okbsbRnAkGTuVpau9
Note: LAPRAS moved to this new office in May, so please double-check the address.

スケジュール (仮) / Schedule (Tentative)

Time / Content
18:45 Registration opens
19:00 Opening and announcements
19:10 Oral Presentation 1
19:15 Oral Presentation 2
19:20 Oral Presentation 3
19:25 Oral Presentation 4
19:30 (Oral Presentation 5)
19:35 (Oral Presentation 6)
19:40 Poster Presentation
20:40 懇親会 / Social Time
21:30 終了 / Close

Presenters (in no particular order; honorific titles omitted)

鈴木 亮太 / Ryota Suzuki (LAPRAS Inc.)

Twitter | GitHub | Blog

Researcher / Algorithm Engineer at LAPRAS Inc. Joined LAPRAS (formerly scouty) in November 2018 after working at the research laboratory of a major electronics manufacturer.

Title

Hyperbolic Disk Embeddings for Directed Acyclic Graphs (ICML 2019)

Abstract

Obtaining continuous representations of structural data such as directed acyclic graphs (DAGs) has gained attention in machine learning and artificial intelligence. However, embedding complex DAGs in which both ancestors and descendants of nodes are exponentially increasing is difficult. To tackle this problem, we develop Disk Embeddings, which is a framework for embedding DAGs into quasi-metric spaces. Existing state-of-the-art methods, Order Embeddings and Hyperbolic Entailment Cones, are instances of Disk Embedding in Euclidean space and spheres respectively. Furthermore, we propose a novel method Hyperbolic Disk Embeddings to handle exponential growth of relations. The results of our experiments show that our Disk Embedding models outperform existing methods especially in complex DAGs other than trees.
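The core idea of Disk Embeddings, representing each DAG node as a disk and encoding the partial order as disk inclusion, can be sketched with a Euclidean toy (the paper works in hyperbolic and general quasi-metric spaces; the function and values below are purely illustrative):

```python
import numpy as np

def disk_contains(c_a, r_a, c_b, r_b):
    """Euclidean disk A contains disk B iff ||c_a - c_b|| <= r_a - r_b.
    In Disk Embeddings, this inclusion encodes the DAG's order relation."""
    return float(np.linalg.norm(np.asarray(c_a) - np.asarray(c_b))) <= r_a - r_b

# Toy order: an ancestor's disk contains its descendant's disk.
ancestor = ([0.0, 0.0], 2.0)
descendant = ([0.5, 0.0], 1.0)
print(disk_contains(*ancestor, *descendant))   # True: edge is consistent
print(disk_contains(*descendant, *ancestor))   # False: the order is asymmetric
```

Training then amounts to fitting centers and radii so that inclusion holds exactly for the DAG's reachability relation.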

Kaito Fujii / 藤井 海斗 (Univ. Tokyo)

Homepage

Third-year doctoral student in the Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo. His research focuses on combinatorial optimization and machine learning.

Title

Beyond adaptive submodularity: Approximation guarantees of greedy policy with adaptive submodularity ratio (ICML 2019)

Abstract

We propose a new concept named adaptive submodularity ratio to study the greedy policy for sequential decision making. While the greedy policy is known to perform well for a wide variety of adaptive stochastic optimization problems in practice, its theoretical properties have been analyzed only for a limited class of problems. We narrow the gap between theory and practice by using adaptive submodularity ratio, which enables us to prove approximation guarantees of the greedy policy for a substantially wider class of problems. Examples of newly analyzed problems include important applications such as adaptive influence maximization and adaptive feature selection. Our adaptive submodularity ratio also provides bounds of adaptivity gaps. Experiments confirm that the greedy policy performs well with the applications being considered compared to standard heuristics.
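For context, the greedy policy analyzed here generalizes the classic greedy rule for submodular maximization. A minimal non-adaptive version (max coverage, with names of our own choosing) looks like this; the talk's adaptive setting re-evaluates gains after observing stochastic feedback:

```python
def greedy_cover(sets, k):
    """Classic greedy for max coverage, a standard submodular objective:
    repeatedly pick the set with the largest marginal gain in coverage."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(sets, key=lambda s: len(s - covered))
        chosen.append(best)
        covered |= best
    return chosen, covered

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}]
chosen, covered = greedy_cover(sets, k=2)
print(covered)  # {1, 2, 3, 4, 5, 6}
```

The adaptive submodularity ratio quantifies how far a given adaptive problem is from this well-behaved case, which is what yields the wider class of approximation guarantees.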

Michael Metel (RIKEN AIP)

Website

Postdoctoral researcher at RIKEN AIP. Currently focused on stochastic first-order methods with applications in machine learning.

Title

Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization (ICML 2019)

Abstract

Our work focuses on stochastic gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer. Research on this class of problem is quite limited, and until very recently no non-asymptotic convergence results have been reported. We present two simple stochastic gradient algorithms, for finite-sum and general stochastic optimization problems, which have superior convergence complexities compared to the current state of the art. We also demonstrate our algorithms' better performance in practice for empirical risk minimization on well known datasets.
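A proximal stochastic-gradient step of the kind studied in this line of work alternates a gradient step on the smooth loss with the proximal map of the regularizer. Below is a minimal sketch using the l0 "norm" (whose prox is hard thresholding) as one illustrative non-smooth non-convex regularizer; the names and parameters are ours, not the paper's:

```python
import numpy as np

def prox_sgd_step(x, stoch_grad, eta, lam):
    """One prox-SGD step for min f(x) + lam*||x||_0 with smooth f:
    gradient step on f, then the prox of eta*lam*||.||_0, which is
    componentwise hard thresholding at sqrt(2*eta*lam)."""
    y = x - eta * stoch_grad
    return np.where(np.abs(y) > np.sqrt(2.0 * eta * lam), y, 0.0)

x = np.array([1.0, 0.05, -0.8])
g = np.array([0.1, 0.0, 0.0])          # a stochastic gradient of f at x
x_new = prox_sgd_step(x, g, eta=0.1, lam=0.05)
print(x_new)  # the small middle coordinate is zeroed out
```

The difficulty the paper addresses is proving non-asymptotic convergence for such steps when the regularizer is both non-smooth and non-convex.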

Shunsuke Kitada / 北田 俊輔 (Hosei Univ.)

Twitter | GitHub | Homepage

Second-year master's student in the Department of Applied Informatics, Graduate School of Science and Engineering, Hosei University. His research spans multiple areas, including natural language processing, medical image processing, and computational advertising.

Title

Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creative (KDD 2019)

Abstract

Accurately predicting conversions in advertisements is generally a challenging task, because such conversions do not occur frequently. In this paper, we propose a new framework to support creating high-performing ad creatives, including the accurate prediction of ad creative text conversions before delivering to the consumer. The proposed framework includes three key ideas: multi-task learning, conditional attention, and attention highlighting. Multi-task learning is an idea for improving the prediction accuracy of conversion, which predicts clicks and conversions simultaneously, to solve the difficulty of data imbalance. Furthermore, conditional attention focuses attention of each ad creative with the consideration of its genre and target gender, thus improving conversion prediction accuracy. Attention highlighting visualizes important words and/or phrases based on conditional attention. We evaluated the proposed framework with actual delivery history data (14,000 creatives displayed more than a certain number of times from Gunosy Inc.), and confirmed that these ideas improve the prediction performance of conversions, and visualize noteworthy words according to the creatives' attributes.
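A bare-bones version of the conditional-attention idea, attention weights over the creative's words conditioned on an attribute embedding such as genre or target gender, might look like the following (the shapes and the bilinear scoring are illustrative assumptions, not the paper's exact architecture):

```python
import numpy as np

def conditional_attention(words, cond, W):
    """words: (T, d) word vectors of one ad creative;
    cond: (c,) embedding of its attributes (e.g. genre, target gender);
    W: (d, c) bilinear scoring matrix (illustrative choice).
    Returns an attention-weighted context vector and the weights."""
    scores = words @ W @ cond                 # (T,) one score per word
    w = np.exp(scores - scores.max())
    w /= w.sum()                              # softmax over the T words
    return w @ words, w

rng = np.random.default_rng(0)
ctx, w = conditional_attention(rng.normal(size=(4, 5)),
                               rng.normal(size=3),
                               rng.normal(size=(5, 3)))
print(w.sum())  # weights sum to 1; large entries mark "highlighted" words
```

The attention-highlighting step in the paper then visualizes the words with the largest weights for a given attribute combination.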

Presenter 5

(Slot open; presenters wanted)

Presenter 6

(Slot open; presenters wanted)

Notes

  • Photos will be taken during the event, and we plan to post them on social media such as Facebook. If this is a problem for you, please let us know in advance.
  • Sales or recruiting pitches to participants are prohibited as a rule.


Feed

nunukim

nunukim posted:

2019/05/22 13:54

@diadochos Thank you for getting in touch. Understood about arriving late; that is no problem at all. Apologies for the slow reply.

Takeshi Teshima

Takeshi Teshima posted:

2019/05/13 19:12

I have a prior commitment until 18:30, so I may be up to about 20 minutes late. Would it be possible to join partway through? Thank you very much.

nunukim

nunukim published ICML/KDD 2019 Pre-conference session.

2019/05/13 18:32

ICML/KDD 2019 Pre-conference session is now published!

Ended

2019/06/03 (Mon) 19:00-22:00

If you have already registered for an event whose schedule overlaps with this one, you cannot register for this event.

Registration Period
2019/05/13(Mon) 18:32 〜
2019/06/03(Mon) 22:00


Attendees (39)

PND, Takeshi Teshima, KanSAKAMOTO, m3yrin, bigsea_t, tnukui, gm3d2, fofof, yhiss, Han Bao, and 29 others.


Canceled (22)