
Monday, February 13, 2023

Trading Bots

Market data into spreadsheet

https://www.alphavantage.co/spreadsheets/ (the data can be shown with Flask)
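A minimal sketch of the "shown with Flask" idea, using Alpha Vantage's documented GLOBAL_QUOTE endpoint; the route is my own choice and the "demo" key (which only covers IBM) is a placeholder:

import requests
from flask import Flask, jsonify

app = Flask(__name__)
API_KEY = "demo"  # placeholder: only works for IBM; get a free key at alphavantage.co

@app.route("/quote/<symbol>")
def quote(symbol):
    # Fetch the latest quote from Alpha Vantage and return it as JSON
    resp = requests.get(
        "https://www.alphavantage.co/query",
        params={"function": "GLOBAL_QUOTE", "symbol": symbol, "apikey": API_KEY},
        timeout=10,
    )
    return jsonify(resp.json())

if __name__ == "__main__":
    app.run(debug=True)  # then open http://127.0.0.1:5000/quote/IBM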



GitHub - birdhouses/ai-trader (Python + OpenAI)

GitHub - birdhouses/trading212: Get insights in your trading212 dividend payouts




Copilot's answer: using Apidog to scrape data from TradingView

To use Apidog for generating a schema from TradingView's screener database, you would need to follow these steps. However, note that scraping TradingView's data directly may violate their terms of service, so it's always better to use their official API if available. Here's a general guide for schema generation using Apidog:


1. Understand the Data Source

  • TradingView does not provide direct access to its database schema. If you're trying to scrape or analyze screener data, you might need to first identify the structure of the data (e.g., JSON responses from their API or web requests).
  • Use tools like browser developer tools (Network tab) to inspect the data format (e.g., JSON or HTML).

2. Use Apidog to Generate Schema

Apidog supports schema generation from JSON or database tables. Here's how you can proceed:

Option 1: From JSON Data

  1. Extract JSON Data:

    • If you have access to TradingView's screener data in JSON format (e.g., from API responses or web scraping), save it as a .json file (see the sketch after these steps).
  2. Generate Schema:

    • Open Apidog and navigate to the "Generate Schema" section.
    • Upload the JSON file or paste the JSON content directly.
    • Apidog will analyze the structure and generate a schema automatically.
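For step 1 of Option 1, a hedged sketch of capturing screener JSON to feed into Apidog. The scanner endpoint below is unofficial (the kind of request you would see in the browser's Network tab), and the payload shape and column names are assumptions that may break without notice:

import json
import requests

# Unofficial, undocumented endpoint observed via the Network tab -- may change
payload = {
    "markets": ["america"],
    "columns": ["name", "close", "volume", "market_cap_basic"],
    "sort": {"sortBy": "volume", "sortOrder": "desc"},
    "range": [0, 50],
}
resp = requests.post("https://scanner.tradingview.com/america/scan",
                     json=payload, timeout=10)
resp.raise_for_status()

# Save the response; this .json file is what you upload to Apidog in step 2
with open("screener_sample.json", "w") as f:
    json.dump(resp.json(), f, indent=2)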

Option 2: From Database Tables

  1. Connect to Database:

    • If you have access to a database containing TradingView-like screener data, connect Apidog to the database.
    • Provide the necessary credentials (host, port, username, password).
  2. Generate Schema:

    • Apidog will read the table structures and generate schemas for each table.

3. Validate and Customize the Schema

  • Review the generated schema to ensure it matches the data structure.
  • Customize field names, data types, or relationships as needed.

Important Notes

  • Respect Terms of Service: Scraping or unauthorized access to TradingView's data may breach their terms of service. Always prefer using their official API if available.
  • Alternative Tools: If Apidog doesn't meet your needs, consider other schema generation tools like Postman or Swagger.



Python Library "TradingView_screener"

https://shner-elmo.github.io/TradingView-Screener/3.0.0/tradingview_screener.html#quickstart

GitHub:

https://github.com/shner-elmo/TradingView-Screener
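A minimal sketch following the library's Quickstart; the field names selected below are assumptions, so check the column list in the docs above:

from tradingview_screener import Query, Column

# Top-volume US stocks trading at or above their 52-week high
count, df = (
    Query()
    .select('name', 'close', 'volume', 'market_cap_basic')
    .where(Column('close') >= Column('price_52_week_high'))
    .order_by('volume', ascending=False)
    .limit(50)
    .get_scanner_data()
)
print(count)      # total matches on the server
print(df.head())  # results arrive as a pandas DataFrame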


It is handy to have an online SQL viewer to look at the contents of the database files.

https://beta.sqliteviewer.app/2025-08-17.db/table/stock_data
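For a quick look without the browser, the standard-library sqlite3 module does the same job; the file and table names below are taken from the URL above and may differ in your setup:

import sqlite3

con = sqlite3.connect("2025-08-17.db")
cur = con.cursor()

# List the columns of the stock_data table
for cid, name, coltype, *_ in cur.execute("PRAGMA table_info(stock_data)"):
    print(cid, name, coltype)

# Peek at the first few rows
for row in cur.execute("SELECT * FROM stock_data LIMIT 5"):
    print(row)

con.close()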


Copilot suggested Python code to write a data file to a Google Sheet.

Here’s a concise guide to storing a file's data into a Google Sheet using Python. This involves enabling the Google Sheets API, authenticating, and writing data to the sheet.

Steps Overview

  1. Enable the Google Sheets API in your Google Cloud Console.
  2. Download the credentials.json file for authentication.
  3. Install required Python libraries: google-auth, google-auth-oauthlib, google-auth-httplib2, and gspread.
  4. Use the following Python code to upload data.

Code Example 1: Writing Data to Google Sheets

import gspread
from google.oauth2.service_account import Credentials

# Authenticate using the credentials.json file
scope = ["https://www.googleapis.com/auth/spreadsheets", "https://www.googleapis.com/auth/drive"]  # current scopes; the old ".../feeds" scope is deprecated
creds = Credentials.from_service_account_file("credentials.json", scopes=scope)
client = gspread.authorize(creds)

# Open the Google Sheet by name
sheet = client.open("Your Google Sheet Name").sheet1

# Example: Writing data to the sheet
data = [["Name", "Age", "City"], ["Alice", 30, "San Francisco"], ["Bob", 25, "New York"]]
for row in data:
    sheet.append_row(row)

print("Data successfully written to Google Sheet!")

Code Example 2: Reading a File and Storing Its Data

import csv
import gspread
from google.oauth2.service_account import Credentials

# Authenticate using the credentials.json file
scope = ["https://www.googleapis.com/auth/spreadsheets", "https://www.googleapis.com/auth/drive"]  # current scopes; the old ".../feeds" scope is deprecated
creds = Credentials.from_service_account_file("credentials.json", scopes=scope)
client = gspread.authorize(creds)

# Open the Google Sheet by name
sheet = client.open("Your Google Sheet Name").sheet1

# Read data from a CSV file and write to the sheet
with open("your_file.csv", "r") as file:
    reader = csv.reader(file)
    for row in reader:
        sheet.append_row(row)

print("File data successfully written to Google Sheet!")

Code Example 3: Writing Data from a Python Dictionary

import gspread
from google.oauth2.service_account import Credentials

# Authenticate using the credentials.json file
scope = ["https://www.googleapis.com/auth/spreadsheets", "https://www.googleapis.com/auth/drive"]  # current scopes; the old ".../feeds" scope is deprecated
creds = Credentials.from_service_account_file("credentials.json", scopes=scope)
client = gspread.authorize(creds)

# Open the Google Sheet by name
sheet = client.open("Your Google Sheet Name").sheet1

# Example: Writing dictionary data to the sheet
data = [{"Name": "Alice", "Age": 30, "City": "San Francisco"},
        {"Name": "Bob", "Age": 25, "City": "New York"}]

# Write headers
sheet.append_row(list(data[0].keys()))

# Write rows
for entry in data:
    sheet.append_row(list(entry.values()))

print("Dictionary data successfully written to Google Sheet!")

Key Notes

  • Replace "Your Google Sheet Name" with the actual name of your Google Sheet.
  • Ensure the credentials.json file is downloaded from your Google Cloud Console and placed in the same directory as your script.
  • Install required libraries using:
    pip install gspread google-auth google-auth-oauthlib google-auth-httplib2
    



Stock analysis MCP server

MCP code for the Indian market only

https://medium.com/@varungangu1/building-a-stock-analysis-mcp-server-with-docker-and-claude-desktop-eae4963dc3a7

His GitHub repository (not much description there):

https://github.com/varun-g-12/StockAnalysisMCP


Python Library "TradingView_scrapper" live example

It saves three files every day in Excel format: 52-week highs, all-time highs, and top gainers.

Then I analyze the patterns; the next thing I would do is save the files into my Google Sheets.


Data is scraped from this TradingView page:

https://www.tradingview.com/markets/stocks-usa/market-movers-52wk-high/


Original article:

 https://medium.com/@alvinmarco8/otomasi-scrapping-data-market-saham-di-tradingview-menggunakan-python-c499a244db40

My code:

from urllib.request import urlopen
from urllib.error import HTTPError
from bs4 import BeautifulSoup
import pandas as pd
from datetime import date
import os

def tradingviewScrapper():
    urls = [
        'https://www.tradingview.com/markets/stocks-usa/market-movers-52wk-high/',
        'https://www.tradingview.com/markets/stocks-usa/market-movers-ath/',
        'https://www.tradingview.com/markets/stocks-usa/market-movers-gainers/'
    ]
    root_path = r'/Users/dl/Documents/codes/tradings/'
    print(f"Saving files to: {root_path}")
    os.makedirs(root_path, exist_ok=True)
    for url in urls:
        try:
            # Content retrieval (if TradingView blocks the default urllib user
            # agent, wrap the URL in a urllib.request.Request with a browser
            # User-Agent header)
            page = urlopen(url)
            html = page.read().decode("utf-8")
            # Create BeautifulSoup object and collect the table rows
            soup = BeautifulSoup(html, "html.parser")
            rows = soup.find_all('tr')
            # e.g. '52wk-high'; strip the trailing '/' so it does not end up
            # in the filename and silently create a subdirectory
            typeurl = url.split('market-movers-')[1].strip('/')
            today = date.today().strftime('%y-%m-%d')
            data = []
            # Loop through each row and get data from the <td> elements
            for row in rows:
                cells = row.find_all('td')
                row_data = [cell.text.strip() for cell in cells]
                if row_data:  # skip the header row, which has no <td> cells
                    data.append(row_data)
            # Column names come from the <th> elements of the first row
            th_elements = rows[0].find_all('th')
            headers = [th.text.strip() for th in th_elements]
            # Create DataFrame from the data list and save it as Excel
            df = pd.DataFrame(data, columns=headers)
            filename = f'Data_{typeurl}_{today}.xlsx'
            excel_path = os.path.join(root_path, filename)
            print(f"Saving Excel to: {excel_path}")
            df.to_excel(excel_path, index=False)  # requires openpyxl
            print(f'Data from {url} successfully scraped.')
        except Exception as e:
            print(f"Error scraping {url}: {e}")

tradingviewScrapper()
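For the "next step" mentioned above (saving the daily files into Google Sheets), a minimal sketch that reuses the credentials.json setup from the gspread examples earlier; the Excel filename is illustrative (one this script would produce) and the sheet name is a placeholder:

import gspread
import pandas as pd
from google.oauth2.service_account import Credentials

scopes = ["https://www.googleapis.com/auth/spreadsheets",
          "https://www.googleapis.com/auth/drive"]
creds = Credentials.from_service_account_file("credentials.json", scopes=scopes)
client = gspread.authorize(creds)
sheet = client.open("Your Google Sheet Name").sheet1

df = pd.read_excel("Data_52wk-high_25-08-17.xlsx")  # illustrative filename; needs openpyxl
df = df.fillna("").astype(str)                      # keep every cell JSON-safe

# One header row plus all data rows, in a single API call
sheet.append_rows([df.columns.tolist()] + df.values.tolist())
print("Daily file uploaded.")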


----------------------------------------------------------------------------------------------------------------------------

TradeBot: Stock Trading using Reinforcement Learning — Part1 | by Shivam Akhauri | ETHER Labs | Medium
