πŸ“¦ Data Flow

The data flow in our platform is meticulously designed to ensure the efficient collection, processing, and utilization of data from multiple sources to deliver insightful analyses and summaries to users. Below is a detailed description of each stage in our data flow process.

Data Collection

Our platform collects data from various reputable sources to ensure comprehensive and accurate token analysis. These sources include market data aggregators, DeFi analytics platforms, and block explorers such as CoinGecko, DefiLlama, DexScreener, and Etherscan, among other third-party services. The data collected can be categorized into two types, sketched in the example after this list:

  • Static Data: Includes attributes that rarely change, such as the token launch date, token description, GitHub repository, and Twitter URL.

  • Dynamic Data: Encompasses frequently changing attributes such as the current token price, market cap, and gas fees.
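To make the split concrete, here is a minimal TypeScript sketch of how the two categories might be modelled. The field names are illustrative assumptions, not the platform's actual schema.

```typescript
// Illustrative only: field names are assumptions, not the platform's actual schema.

// Static data: attributes that rarely change after launch.
interface StaticTokenData {
  symbol: string;
  address: string;
  launchDate: string;      // ISO-8601 date
  description: string;
  githubRepo?: string;     // optional: not every token publishes a repository
  twitterUrl?: string;
}

// Dynamic data: attributes refreshed on every analysis request.
interface DynamicTokenData {
  priceUsd: number;
  marketCapUsd: number;
  gasFeeGwei: number;
  fetchedAt: string;       // timestamp of the snapshot
}
```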

Data Processing

Data processing in our platform occurs in two major stages:

  1. Data Aggregation Stage:

    • Fetching Data: Upon receiving a request from the frontend with a token symbol or address, the backend fetches data from multiple third-party services. This data includes both static and dynamic attributes of the token.

    • Aggregation and Calculations: The collected data is aggregated and used to compute derived metrics such as liquidity measures, trading volumes, and risk indicators (a minimal aggregation sketch follows this list).

  2. AI Inference Stage:

    • The processed data is then passed to a third-party AI model, GPT-3.5, which interprets the combined data set and generates a detailed summary along with meaningful conclusions and warnings.
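The sketch below illustrates the aggregation stage under stated assumptions: the third-party endpoints, response fields, and the derived liquidity-to-market-cap ratio are placeholders, not the platform's actual services or formulas.

```typescript
// Sketch of the aggregation stage. Endpoints and the derived metric are
// placeholders; the production backend may use different services and formulas.
interface AggregatedTokenData {
  priceUsd: number;
  marketCapUsd: number;
  liquidityUsd: number;
  volume24hUsd: number;
  liquidityToMarketCapRatio: number; // example of a derived liquidity/risk indicator
}

async function aggregateTokenData(tokenAddress: string): Promise<AggregatedTokenData> {
  // Fetch from several third-party services concurrently.
  const [marketData, dexData] = await Promise.all([
    fetch(`https://api.example-market-data.com/tokens/${tokenAddress}`).then(r => r.json()),
    fetch(`https://api.example-dex-data.com/pairs/${tokenAddress}`).then(r => r.json()),
  ]);

  const liquidityUsd = dexData.liquidityUsd ?? 0;
  const marketCapUsd = marketData.marketCapUsd ?? 0;

  return {
    priceUsd: marketData.priceUsd,
    marketCapUsd,
    liquidityUsd,
    volume24hUsd: dexData.volume24hUsd ?? 0,
    // Derived metric: how deep the liquidity is relative to the market cap.
    liquidityToMarketCapRatio: marketCapUsd > 0 ? liquidityUsd / marketCapUsd : 0,
  };
}
```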

Frontend and Backend Data Interaction

The data flow between the frontend and backend is designed to be seamless and efficient:

  • Data Request: When a user navigates to a token analysis page, the frontend sends a request to the backend with the token symbol or address.

  • Data Fetching and Processing: The backend uses this token identifier to fetch relevant data from the aforementioned third-party services. The fetched data is then processed and analyzed as described above.

  • Data Response: The processed data, along with AI-generated insights, is sent back to the frontend. The frontend then renders this data visually using graphs and charts for user consumption (a minimal request sketch follows this list).
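A minimal sketch of the frontend side of this exchange is shown below; the /api/analysis route and the response shape are assumptions made for illustration.

```typescript
// Frontend request for a token analysis. The route and response shape are
// assumptions, not the platform's actual API contract.
interface AnalysisResponse {
  token: { symbol: string; address: string };
  metrics: Record<string, number>; // aggregated and derived figures for charts
  aiSummary: string;               // AI-generated summary rendered alongside the charts
}

async function requestTokenAnalysis(tokenIdOrAddress: string): Promise<AnalysisResponse> {
  const res = await fetch(`/api/analysis?token=${encodeURIComponent(tokenIdOrAddress)}`);
  if (!res.ok) {
    throw new Error(`Analysis request failed: ${res.status}`);
  }
  return res.json() as Promise<AnalysisResponse>;
}
```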

Data Storage and Retrieval

Data storage and retrieval are handled using PostgreSQL hosted on AWS RDS, with IPFS used for decentralized storage:

  • User Data Storage: User data is stored when a user signs in for the first time. This includes user details and interaction data that help us better understand user needs.

  • Token Analysis Data Storage: When a user requests a token analysis, the relevant data is updated in the database. This allows on-demand tables and dashboards to be rendered (a minimal storage sketch follows this list).
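The following sketch shows how the token-analysis upsert might look with node-postgres against the RDS instance. The table name, columns, and connection settings are assumptions and may differ from the production schema.

```typescript
import { Pool } from "pg";

// Sketch of the storage step. Table name, columns, and connection settings
// are assumptions; the production schema may differ.
const pool = new Pool({ connectionString: process.env.DATABASE_URL }); // AWS RDS endpoint

async function upsertTokenAnalysis(tokenAddress: string, analysis: object): Promise<void> {
  // Insert a fresh analysis, or refresh the stored one if the token was analysed before.
  await pool.query(
    `INSERT INTO token_analyses (token_address, analysis, updated_at)
     VALUES ($1, $2, NOW())
     ON CONFLICT (token_address)
     DO UPDATE SET analysis = EXCLUDED.analysis, updated_at = NOW()`,
    [tokenAddress, JSON.stringify(analysis)]
  );
}
```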

AI Data Analysis

The AI components play a crucial role in our data flow:

  • Data Input to AI Model: Processed token data is fed into the GPT-3.5 AI model.

  • AI Processing: The AI model analyzes the data and generates a detailed summary and insights, helping users understand complex data sets and make informed decisions (a minimal inference sketch follows this list).
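A minimal sketch of the inference step is shown below, assuming the official OpenAI Node SDK; the prompt wording and data shape are illustrative, not the platform's actual prompt.

```typescript
import OpenAI from "openai";

// Sketch of the inference step, assuming the official OpenAI Node SDK.
// The prompt wording is illustrative, not the platform's actual prompt.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function summarizeTokenData(aggregatedData: object): Promise<string> {
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "system",
        content:
          "You are a crypto analyst. Summarize the token metrics below, highlight risks, and add warnings where appropriate.",
      },
      { role: "user", content: JSON.stringify(aggregatedData) },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```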

Blockchain Data Integration

Our platform integrates live blockchain data to enhance token analysis and smart contract auditing:

  • User Product: Live blockchain data is used to generate on-the-fly token analyses for users, ensuring they always have access to the most current and relevant data (a minimal on-chain query sketch follows this list).

  • B2B and Developer Products: These products focus on auditing smart contracts using AI. The auditing process includes both static and dynamic analysis of smart contracts to identify vulnerabilities and ensure security.
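As one example of pulling live on-chain data, the sketch below reads the current gas price over standard Ethereum JSON-RPC. The RPC endpoint is a placeholder, and the production backend may rely on provider SDKs or explorer APIs instead.

```typescript
// Sketch of fetching a live on-chain figure over standard Ethereum JSON-RPC.
// The RPC URL is a placeholder; any provider endpoint would work here.
const RPC_URL = "https://rpc.example-ethereum-node.com";

async function fetchCurrentGasPriceGwei(): Promise<number> {
  const res = await fetch(RPC_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "eth_gasPrice", params: [] }),
  });
  const { result } = await res.json(); // hex-encoded wei, e.g. "0x3b9aca00"
  return parseInt(result, 16) / 1e9;   // convert wei to gwei
}
```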

By employing a comprehensive and technically sophisticated data flow architecture, our platform ensures that users receive accurate, timely, and insightful analyses of tokens and smart contracts. This approach leverages the strengths of both AI and blockchain technologies to deliver unparalleled value and security to our users.
