RC Version 6.0 © Powered By DSPACE, MIT. Enhanced by NTU Library IR team.
    Please use this identifier to cite or link to this item: http://nccur.lib.nccu.edu.tw/handle/140.119/126582


    Title: 基於圖形卷積神經網路之異質性圖譜表示法學習
    Heterogeneous Graph Embedding Based on Graph Convolutional Neural Networks
    Authors: 蘇裕勝
    Su, Yu-Sheng
    Contributors: 蔡銘峰
    Tsai, Ming-Feng
    蘇裕勝
    Su, Yu-Sheng
    Keywords: Network Embedding; Graph Convolutional Neural Networks; GNN; Link Prediction; Recommendation
    Date: 2019
    Issue Date: 2019-10-03 17:18:08 (UTC+8)
    Abstract: In recent years, with the enormous growth in data volume, storing this data and using it for analysis, knowledge-base management, recommendation, and similar tasks has become very challenging. Information network embedding, which can effectively project different kinds of nodes and relations into a low-dimensional space, has therefore become a very active research area; recently, the concept of graph neural networks (GNNs) has also been introduced into network embedding and applied to tasks such as classification and recommendation. This thesis proposes a heterogeneous information network embedding framework: node representations learned by embedding methods are first used as input features; then, by constructing homogeneous network graphs and training with GraphSAGE, all the required nodes are projected into the same space for link prediction and recommendation. For link prediction, our graph-construction method makes it possible to embed multiple kinds of node features together for training, which effectively improves the link-prediction F1-score. For recommendation, our graph construction takes more high-order information into account, improving the recommender system's MAP, Recall, and Hit ratio.
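The recommendation metrics named above (MAP, Recall, Hit ratio) are standard top-k ranking measures. A minimal sketch of how they are typically computed per user is below; the ranked list and ground-truth set are hypothetical toy data, not the thesis's actual evaluation setup:

```python
def hit_ratio_at_k(ranked, relevant, k):
    """1.0 if at least one relevant item appears in the top-k list, else 0.0."""
    return float(any(item in relevant for item in ranked[:k]))

def recall_at_k(ranked, relevant, k):
    """Fraction of the relevant items recovered within the top-k list."""
    hits = sum(item in relevant for item in ranked[:k])
    return hits / len(relevant)

def average_precision(ranked, relevant):
    """Average of precision@i over the ranks i where a relevant item appears;
    averaging this quantity over users gives MAP."""
    hits, score = 0, 0.0
    for i, item in enumerate(ranked, start=1):
        if item in relevant:
            hits += 1
            score += hits / i
    return score / len(relevant) if relevant else 0.0

ranked = ["b", "a", "d", "c"]   # hypothetical ranking produced by a model
relevant = {"a", "c"}           # hypothetical ground-truth items for one user

print(hit_ratio_at_k(ranked, relevant, 2))   # 1.0
print(recall_at_k(ranked, relevant, 2))      # 0.5
print(average_precision(ranked, relevant))   # 0.5
```

Each metric is averaged over all test users to obtain the final score reported for a recommender.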
    In recent years, information network embedding has become popular because these techniques make it possible to encode information into low-dimensional representations, even for a graph/network with multiple types of nodes and relations. In addition, graph neural networks (GNNs) have also shown their effectiveness in learning large-scale node representations for node classification. In this paper, therefore, we propose a framework based on heterogeneous network embedding and the idea of graph neural networks. In our framework, we first generate node representations with various network embedding methods. Then, we split a homogeneous network graph into subgraphs and concatenate the learned node representations into the same embedding space. After that, we apply a GNN variant, GraphSAGE, to generate representations for the tasks of link prediction and recommendation. In our experiments, the results on both tasks show the effectiveness of the proposed framework.
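The core step of the framework — refining pretrained node embeddings with GraphSAGE-style mean aggregation over graph neighbourhoods — can be sketched roughly as follows. This is a pure-NumPy illustration of one mean-aggregator layer; the toy graph, feature dimensions, and random weights are placeholders, not the thesis's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy homogeneous graph as an adjacency list (hypothetical, 4 nodes).
neighbors = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}

# Input features: pretrained node embeddings (e.g. produced by a network
# embedding method such as DeepWalk or LINE); random stand-ins here.
H = rng.normal(size=(4, 8))

# One GraphSAGE layer with mean aggregation:
#   h_v' = normalize(ReLU(W^T [h_v ; mean({h_u : u in N(v)})]))
W = rng.normal(size=(16, 8)) * 0.1  # maps concat(8 + 8) = 16 dims -> 8 dims

def sage_layer(H, neighbors, W):
    out = []
    for v in range(H.shape[0]):
        agg = H[neighbors[v]].mean(axis=0)          # mean of neighbour vectors
        z = np.concatenate([H[v], agg]) @ W         # project the concatenation
        z = np.maximum(z, 0.0)                      # ReLU
        out.append(z / (np.linalg.norm(z) + 1e-8))  # L2-normalise, as in GraphSAGE
    return np.vstack(out)

H1 = sage_layer(H, neighbors, W)
print(H1.shape)  # (4, 8)
```

In a trained model, W is learned and layers are stacked so each node mixes in progressively higher-order neighbourhood information; the resulting vectors can then be scored pairwise for link prediction or ranked for recommendation.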
    Reference: [1] P. W. Battaglia, J. B. Hamrick, V. Bapst, A. Sanchez-Gonzalez, V. F. Zambaldi, M. Malinowski, A. Tacchetti, D. Raposo, A. Santoro, R. Faulkner, Ç. Gülçehre, F. Song, A. J. Ballard, J. Gilmer, G. E. Dahl, A. Vaswani, K. Allen, C. Nash, V. Langston, C. Dyer, N. Heess, D. Wierstra, P. Kohli, M. Botvinick, O. Vinyals, Y. Li, and R. Pascanu. Relational inductive biases, deep learning, and graph networks. CoRR, abs/1806.01261, 2018.
    [2] R. Burke. Hybrid recommender systems: Survey and experiments. User Modeling and User-Adapted Interaction, 12(4):331–370, Nov 2002.
    [3] Y. Dong, N. V. Chawla, and A. Swami. metapath2vec: Scalable representation learning for heterogeneous networks. In KDD ’17, pages 135–144. ACM, 2017.
    [4] A. Grover and J. Leskovec. Node2vec: Scalable feature learning for networks. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '16, pages 855–864, New York, NY, USA, 2016. ACM.
    [5] W. L. Hamilton, R. Ying, and J. Leskovec. Inductive representation learning on large graphs. In NIPS, 2017.
    [6] G. E. Hinton. Learning distributed representations of concepts. In Proceedings of the eighth annual conference of the cognitive science society, volume 1, page 12. Amherst, MA, 1986.
    [7] T. N. Kipf and M. Welling. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907, 2016.
    [8] T. Mikolov, I. Sutskever, K. Chen, G. Corrado, and J. Dean. Distributed representations of words and phrases and their compositionality. In Proceedings of the 26th
    International Conference on Neural Information Processing Systems - Volume 2, NIPS’13, pages 3111–3119, USA, 2013. Curran Associates Inc.
    [9] M. J. Pazzani and D. Billsus. The adaptive web. chapter Content-based Recommendation Systems, pages 325–341. Springer-Verlag, Berlin, Heidelberg, 2007.
    [10] B. Perozzi, R. Al-Rfou, and S. Skiena. Deepwalk: Online learning of social representations.
    In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD ’14, pages 701–710, New York, NY, USA, 2014. ACM.
    [11] J. Qiu, Y. Dong, H. Ma, J. Li, K. Wang, and J. Tang. Network embedding as matrix factorization: Unifying deepwalk, line, pte, and node2vec. In Proceedings of the
    Eleventh ACM International Conference on Web Search and Data Mining, WSDM ’18, pages 459–467, New York, NY, USA, 2018. ACM.
    [12] P. Resnick, N. Iacovou, M. Suchak, P. Bergstrom, and J. Riedl. Grouplens: An open architecture for collaborative filtering of netnews. In Proceedings of the 1994 ACM
    Conference on Computer Supported Cooperative Work, CSCW ’94, pages 175–186, New York, NY, USA, 1994. ACM.
    [13] B. Sarwar, G. Karypis, J. Konstan, and J. Riedl. Item-based collaborative filtering recommendation algorithms. In Proceedings of the 10th International Conference on World Wide Web, WWW '01, pages 285–295, New York, NY, USA, 2001. ACM.
    [14] C. Shi and P. S. Yu. Heterogeneous Information Network Analysis and Applications. Springer Publishing Company, Incorporated, 1st edition, 2017.
    [15] J. Tang, M. Qu, M. Wang, M. Zhang, J. Yan, and Q. Mei. Line: Large-scale information network embedding. In Proceedings of the 24th International Conference on World Wide Web, WWW '15, pages 1067–1077, Republic and Canton of Geneva, Switzerland, 2015. International World Wide Web Conferences Steering Committee.
    [16] P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, and Y. Bengio. Graph attention networks. In International Conference on Learning Representations, 2018. Accepted as poster.
    [17] K. Xu, W. Hu, J. Leskovec, and S. Jegelka. How powerful are graph neural networks? In International Conference on Learning Representations, 2019.
    [18] J. Zhou, G. Cui, Z. Zhang, C. Yang, Z. Liu, and M. Sun. Graph neural networks: A review of methods and applications. CoRR, abs/1812.08434, 2018.
    Description: Master's thesis
    National Chengchi University
    Department of Computer Science
    106753004
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0106753004
    Data Type: thesis
    DOI: 10.6814/NCCU201901186
    Appears in Collections: [Department of Computer Science] Theses

    All items in 政大典藏 are protected by copyright, with all rights reserved.

