Commit 464c6f33 by Wenjie Huang

clean up docs

parent 67185ab3
Advanced Data Preprocessing
===========================

.. note::
    This section details the usage of StarryGL's data management classes (such as
    GraphData), the design of their internal index structures, and the underlying
    operations.
Distributed Partition Parallel
==============================

.. note::
    This section covers distributed partition-parallel training.

**Distributed partition parallel** refers to partitioning a large graph into
multiple partitions and distributing them to different workers. Each worker
trains on its own partition in parallel, and data is exchanged across
partitions as needed.
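The partitioning idea can be sketched in plain Python. This is illustrative only, not the StarryGL API: the ``partition_graph`` helper and its naive modulo node-assignment strategy are assumptions made for the example (real systems typically use a balanced partitioner such as METIS).

```python
# Illustrative sketch only (not the StarryGL API): assign each node to a
# worker by a trivial modulo rule, group edges by the owner of their source
# node, and count the cut edges that would require cross-worker exchange.

def partition_graph(edges, num_workers):
    """edges: list of (u, v) pairs with integer node ids.

    Returns (parts, cut) where parts[w] is the edge list owned by worker w
    and cut counts edges whose endpoints live on different workers.
    """
    owner = lambda n: n % num_workers  # hypothetical assignment strategy
    parts = {w: [] for w in range(num_workers)}
    cut = 0
    for u, v in edges:
        parts[owner(u)].append((u, v))
        if owner(u) != owner(v):
            cut += 1  # endpoint on another worker -> data exchange needed
    return parts, cut

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
parts, cut = partition_graph(edges, num_workers=2)
```

Here 4 of the 5 edges cross the partition boundary, which is exactly the traffic that the data-exchange step between workers must cover.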
Distributed Sampling Parallel
=============================

.. note::
    A training mode based on distributed temporal graph sampling.

.. toctree::
    sampler
    features
Distributed Timeline Parallel
=============================

.. note::
    This section covers distributed timeline parallelism.

Timeline parallelism means that each node in the graph performs independent
computation, since each node has its own timeline. These timelines may span
multiple machines or processes, and nodes need not synchronize or communicate
with each other immediately.
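The per-node independence described above can be sketched as follows. This is a minimal illustration, not the StarryGL API: the event format and the ``process_timelines`` helper are assumptions made for the example.

```python
# Sketch (not the StarryGL API): each node owns an independent timeline of
# timestamped events and folds over it in time order without synchronizing
# with any other node. In a distributed setting, each per-node loop could
# run on a different machine or process.

from collections import defaultdict

def process_timelines(events):
    """events: iterable of (node, timestamp, value) triples.

    Returns a dict mapping each node to the result of replaying its own
    timeline in timestamp order (here, a running sum as a stand-in for a
    real per-node state update).
    """
    timelines = defaultdict(list)
    for node, t, val in events:
        timelines[node].append((t, val))

    state = {}
    for node, timeline in timelines.items():
        acc = 0
        for t, val in sorted(timeline):  # each node's own time order
            acc += val
        state[node] = acc
    return state

events = [("a", 1, 2), ("b", 1, 5), ("a", 3, 1), ("b", 2, -1)]
state = process_timelines(events)
```

Because no step in one node's loop depends on another node's state, the loops can proceed at different speeds on different workers, which is the property this parallel mode exploits.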
Distributed Temporal Sampling
=============================
.. note::
    A training mode based on distributed temporal graph sampling.
starrygl.distributed
====================

.. note::
    Auto-generated API documentation; inline docstrings still need to be added
    in the starrygl source tree.

.. currentmodule:: starrygl.distributed
.. autosummary::
@@ -4,4 +4,3 @@ Get Started
.. toctree::
    install_guide
    intro_example
@@ -9,8 +9,6 @@ StarryGL Documentation
    tutorial/index
    advanced/index
    api/python/index
    cheatsheets/index
    external/index
.. Indices and tables
.. ==================