Sunday, September 30, 2012
Guishan Island. The 401 Highland. A loop around the island.
Date: 2012-09-22
A while back, Professor Lin (林教授) wrote to 宗偉, 華倚, 思賢, and 小根, asking us to put together an activity for the association.
After much deliberation, the four of us settled on an easy route: the 401 Highland on Guishan Island (龜山島).
Proposing an activity is simple; running one is a hassle, with plenty of details to handle. Fortunately there were four of us, and once the work was divided up, everything went smoothly. 小根 took care of the insurance. Everyone really should carry accident insurance; colleague B, for instance, renews his every year. Nine people joined this trip, including big brother 大咖.
Speaking of which, I strained my pes anserinus muscles on the last hike and half wanted to skip Guishan Island to focus on recovering. But a permit for the 401 Highland is hard to get, and such a short climb surely wouldn't matter! The route up the 401 Highland is a steep staircase of 1,706 steps, with a rest stop at the 1,000th. I set myself a goal: however it went, I would push through to the 1,000th step in one breath ((my ambitions have shrunk, but a good man doesn't dwell on past glories?! What's past is past; only the present me is real)). Perhaps I had been away from the mountains too long and it felt like a stretch, but my legs held up fine. The island's muggy air wrung out plenty of sweat; my body could handle it, as long as my mind kept up.
((摸 had warned me beforehand not to go up if I felt unwell. Ha. Knowing full well the mountain holds tigers, I headed straight for them.))
大咖 charged to the front to grab photos; 思賢, Ben Lo, and I fought over second place and reached the 1,000th step in no time. The three of us lounged in the pavilion waiting for the rest of the team, who obligingly gave us a good long rest; the island's feeble sea breeze nearly lulled me to sleep. The final stretch was only seven-hundred-odd steps; push through and the whole of Guishan Island comes into view. How exciting. The descent was where I suffered: afraid of straining the pes anserinus again, my thigh muscles worked overtime, and I finished with slightly leaden legs.
First stop of the comeback: shall we call it a success?
PS. The one blemish: Ben Lo got stung by a hornet.
Sunday, September 23, 2012
海力士 (Japanese cuisine)
2012-09-23
I have always thought 豚馬 was the best, and I still do.
But it is simply too far away, and sometimes I am too lazy to make the trip, so I needed a decent place in Yonghe. Hence this recommendation: 海力士.
海力士 Japanese Cuisine
Address: 新北市永和區福和路127號
Tel: (02) 3233-8488
Today I tried the assorted sashimi (small, NT$150) and the eel rice (NT$270). Not bad at all; I ate happily. If I had to pick faults: it is not refined enough, and the service is nothing special either. I am strict about these things.
Even so, the inconvenient location remains a fact of life. When there is something worth eating, eat it!
Monday, September 17, 2012
Sunday, September 16, 2012
[Python] webbrowser example
import sys
import webbrowser


def main():
    webbrowser.open_new_tab('http://www.google.com/')
    webbrowser.open_new_tab('http://www.yahoo.com/')


if __name__ == '__main__':
    sys.exit(main())
HTTP Status Code Definitions
http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
200 OK
Standard response for successful HTTP requests. The actual response will depend on the request method used. In a GET request, the response will contain an entity corresponding to the requested resource. In a POST request the response will contain an entity describing or containing the result of the action.
Explanation:
--2012-09-16 22:15:02-- http://mops.twse.com.tw/mops/web/...
Resolving mops.twse.com.tw... 210.69.240.131
Connecting to mops.twse.com.tw|210.69.240.131|:80... connected.
HTTP request sent, awaiting response... 200 O.K
Length: unspecified [text/html]
Saving to: `...'
5xx Server Error
Response status codes beginning with the digit "5" indicate cases in which the server is aware that it has erred or is incapable of performing the request. Except when responding to a HEAD request, the server SHOULD include an entity containing an explanation of the error situation, and whether it is a temporary or permanent condition. User agents SHOULD display any included entity to the user. These response codes are applicable to any request method.
Explanation: ran into this before. Unlucky. There is another failure mode that is even more annoying: a response timeout, where you don't even get an HTTP status code.
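The two failure modes can be told apart in code. A minimal sketch using Python's standard-library urllib (the posts here use wget and httplib2 instead; the helper name `fetch` is my own):

```python
import socket
import urllib.error
import urllib.request


def fetch(url, timeout=5):
    """Return (status, body); status is None when no HTTP status arrived."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.getcode(), resp.read()
    except urllib.error.HTTPError as e:
        # The server answered with 4xx/5xx: at least we get a status code.
        return e.code, None
    except (socket.timeout, urllib.error.URLError):
        # Timeout or connection failure: no status code at all.
        return None, None
```

A 500 lands in the HTTPError branch with a usable `e.code`; a hung or unreachable server lands in the last branch with nothing to report.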
Saturday, September 15, 2012
Gold up 1.9%, eyeing US$1,800
http://www.appledaily.com.tw/appledaily/article/headline/20120915/34510651
Gold up 1.9%, eyeing US$1,800
Retail investors rush to sell; "the real big holders haven't moved yet"
September 15, 2012
Gold hit a fresh interim high yesterday and could climb to US$1,800 per ounce. Photo: 林琨凱
[王立德, 張琬聆, 吳苡辰 / Taipei] Spurred by the US launch of QE3, international gold broke above US$1,770 per ounce yesterday, and a wave of retail selling is building at local goldsmiths.
Spillover effect expected
李文欽, chairman of the Taipei Gold and Silver Jewelry Association, pointed out that the real holders of large physical gold positions have yet to move; their selling target is expected to be US$1,800 per ounce.
Domestic jewelry gold jumped yesterday: the selling price reached NT$6,540 per qian (台錢), up NT$120 from the previous day, a 1.9% gain. Bank of Taiwan's gold passbook selling price also reached NT$1,681 per gram, with gold bars at NT$1.688 million per kilogram.
方立寬, fund manager of the Polaris gold futures trust, likened quantitative easing to a gas pedal and the gold price to a car: once the pedal is pressed, the car not only accelerates but keeps coasting. In the same way, gold surges when easing is announced, and a further spillover effect can be expected.
Shifting into other currencies
楊天立, deputy manager of Bank of Taiwan's precious metals department, still sees gold as a medium- to long-term buy, but whether it can hold above US$1,800 per ounce is the indicator to watch. If it holds, gold could challenge the US$1,812 to 1,850 range; if short-term profit-taking sets in, it could retest US$1,730 to 1,750.
Beyond gold, bankers suggested that ordinary investors trim their US dollar holdings in favor of the Australian and New Zealand dollars, with risk-tolerant investors buying on dips. The renminbi is also expected to see another round of appreciation.
As for real estate, 洪嘉昇, general manager of Huaku Development, said QE3 would normally send people chasing asset-preserving investments, but given the central bank's strict controls on mortgage rates and on developers' land and construction financing, the QE3 money that should have helped property will have little effect for now.
Funds may not flow into property
廖昭雄, spokesman for Highwealth Construction, said flatly that inflation fears will not trigger a rush into housing: "The central bank is no fool. Governor 彭淮南, with his nine straight A ratings, will surely adjust first and keep inflation from happening." 楊宗憲, assistant professor of real estate management at National Pingtung Institute of Commerce, added that judging from QE1 and QE2, US money-printing at most moved Taiwan's stock market in sympathy; the flood of liquidity did not necessarily flow into real estate.
Gold investment suggestions
Investor type / suggested approach
● Investors with no position
Build a partial gold position first; buy on dips, sell in batches on rallies.
● Investors already holding gold
Sell on rallies; clear out above US$1,850 per ounce, then adjust strategy according to international financial conditions.
● Conservative investors
Buy in batches if gold falls back below US$1,750; clear out above US$1,800.
● Aggressive investors
Buy in batches; once gold breaks US$1,820 per ounce, adjust strategy according to international financial conditions.
Source: investment trusts and banks
My take: look up historical prices at http://www.kitco.com/charts/historicalgold.html and take a long window, 1975 to 2012. Standing on a hilltop like this, it is hard not to get hurt. Of course the reaction to QE3 remains to be seen, but I suspect this has everything to do with the US election; the crowd has to be warmed up somehow.
Friday, September 14, 2012
[Python] Sanity Test on Cash Flow Statement
import logging
import os
import sys

import logger  # project-local logging helper


class SanityCheckerTxt():

    def __init__(self):
        self.__logger = logging.getLogger()
        self.white_msg = [
            '資料庫中查無需求資料',
            '無應編製合併財報之子公司',
            '外國發行人免申報個別財務報表資訊,請至合併財務報表查詢',
        ]
        self.check_list_prefixes = {
            'operating_activity': [
                '營業活動',
            ],
            'investing_activity': [
                '投資活動',
            ],
            'financing_activity': [
                '融資活動',
                '理財活動',
                '不影響現金流量之融資活動',
            ],
            'net_income': [
                '本年度淨利',
                '本期母公司淨利',
                '本期合併淨利',
                '本期合併淨損',
                '本期合併總利益',
                '本期合併總損益',
                '本期純利',
                '本期純益',
                '本期純損',
                '本期純(損)益',
                '本期淨利',
                '本期淨純益',
                '本期淨益',
                '本期淨損',
                '本期淨(損)',
                '本期稅後純益',
                '本期稅後淨利',
                '本期損益',
                '本期總損益',
                '本期(損)純益',
                '合併(損)',
                '合併純益',
                '合併純(損)益',
                '合併淨利',
                '合併淨損益',
                '合併淨損',
                '合併總利益',
                '合併總純益',
                '合併總純損',
                '合併總純(損)益',
                '合併總淨利',
                '合併總淨損',
                '合併總損失',
                '合併總損益',
                '合併總(損)益',
                '純益',
                '純損',
                '純(損)益',
                '淨利',
                '淨損',
                '歸屬於母公司股東之合併純益',
                '歸屬於母公司股東之合併淨利',
                '歸屬於母公司股東之純益',
                '歸屬於母公司股東之淨利',
                '繼續營業單位稅後淨利',
                'A10000',  # t05st36_2384_99_02 and t05st36_2384_99_03
            ],
        }

    def check_batch(self, src_dir):
        self.__logger.debug('''Check directory: %s''' % src_dir)
        for filename in os.listdir(src_dir):
            self.check(os.path.join(src_dir, filename))
            #return  # for debugging

    def check(self, filepath):
        self.__logger.debug(filepath)
        assert os.path.isfile(filepath)
        with open(filepath, 'rb') as fd:
            content = fd.read()
        lines = content.decode('utf-8').split('\n')
        # No record - Test if white message
        if len(lines) == 1:
            msg = lines[0]
            if msg not in self.white_msg:
                self.__logger.info('''%s => %s''' % (filepath, msg))
            return
        # Has record - Test if startswith certain prefixes
        check_list = {
            'operating_activity': False,
            'investing_activity': False,
            'financing_activity': False,
            'net_income': False,
        }
        for line in lines:
            normed_line = ''.join(line.strip().split())
            normed_line = normed_line.replace('│', '').replace('予', '於')
            normed_line = normed_line.replace('(', '(').replace(')', ')')
            for key in check_list:
                for prefix in self.check_list_prefixes[key]:
                    if normed_line.startswith(prefix):
                        check_list[key] = True
                        break
        for key in check_list:
            if not check_list[key]:
                self.__logger.info('''%s => No %s''' % (filepath, key))
                self.__logger.debug('\n'.join(lines))


def main():
    logger.config_root(level=logging.INFO)
    c = SanityCheckerTxt()
    c.check_batch('./txt/2498')


if __name__ == '__main__':
    sys.exit(main())
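The checker's prefix matching works only because each line is normalized first: all whitespace (including full-width U+3000 spaces) is collapsed, table-border characters are dropped, character variants are unified, and full-width parentheses are folded to ASCII. A standalone sketch of that normalization (the function name is mine, not from the original):

```python
def normalize(line):
    # Collapse all whitespace, including full-width U+3000 spaces.
    s = ''.join(line.strip().split())
    # Drop table-border characters and unify the 予/於 variants.
    s = s.replace('│', '').replace('予', '於')
    # Fold full-width parentheses to ASCII so one prefix covers both forms.
    s = s.replace('(', '(').replace(')', ')')
    return s


# A bordered, padded line with full-width parentheses...
print(normalize('│ 本期純(損)益 \u3000 123'))  # → 本期純(損)益123
```

After this, a single `startswith('本期純(損)益')` check matches every layout variant of the same caption.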
Tuesday, September 11, 2012
[C#] Locking a file on purpose
My C# chops haven't rusted yet. Cause for celebration.
using System;
using System.IO;

namespace FileLocker
{
    class Program
    {
        static void Main(string[] args)
        {
            if (args.Length != 1)
            {
                Console.WriteLine("Usage: FileLocker.exe file");
                return;
            }
            String file = args[0];
            if (!File.Exists(file))
            {
                Console.WriteLine("File doesn't exist: {0}", file);
                return;
            }
            LockFile(file);
        }

        static void LockFile(String file)
        {
            using (File.Open(file, FileMode.Open, FileAccess.Read, FileShare.None))
            {
                Console.WriteLine("Press any key to unlock file...");
                Console.ReadKey();
            }
        }
    }
}
外國發行人免申報個別財務報表資訊,請至合併財務報表查詢
A new trick I ran into recently. The site now returns this message:
外國發行人免申報個別財務報表資訊,請至合併財務報表查詢
(Foreign issuers are exempt from filing standalone financial statements; please consult the consolidated financial statements instead.)
I have run into all sorts of other oddities as well.
I strongly suspect the cash flow statements are raw txt dumped straight into the database.
Just as well. It means the system vendor isn't too crafty; if they were, 小根 would have a much harder time scraping the pages.
Thursday, September 6, 2012
Still a very, very dirty Python script
Interesting results:
{'StockSymbol': '1414', 'K': '39.20', 'ROE': '24.63', 'ClosePrice': '9.03', 'BookValue': '14.37'}
{'StockSymbol': '1416', 'K': '16.84', 'ROE': '14.78', 'ClosePrice': '15.55', 'BookValue': '17.72'}
{'StockSymbol': '1460', 'K': '22.16', 'ROE': '18.60', 'ClosePrice': '8.20', 'BookValue': '9.77'}
{'StockSymbol': '1808', 'K': '22.82', 'ROE': '49.08', 'ClosePrice': '39.20', 'BookValue': '18.23'}
{'StockSymbol': '2498', 'K': '18.72', 'ROE': '52.13', 'ClosePrice': '254.00', 'BookValue': '91.21'}
{'StockSymbol': '2501', 'K': '19.71', 'ROE': '19.96', 'ClosePrice': '12.85', 'BookValue': '12.69'}
{'StockSymbol': '2611', 'K': '42.12', 'ROE': '41.27', 'ClosePrice': '15.60', 'BookValue': '15.92'}
{'StockSymbol': '2820', 'K': '33.23', 'ROE': '23.19', 'ClosePrice': '10.65', 'BookValue': '15.26'}
{'StockSymbol': '2888', 'K': '15.08', 'ROE': '14.66', 'ClosePrice': '8.08', 'BookValue': '8.31'}
{'StockSymbol': '3052', 'K': '15.38', 'ROE': '12.80', 'ClosePrice': '10.50', 'BookValue': '12.62'}
{'StockSymbol': '3056', 'K': '23.32', 'ROE': '38.19', 'ClosePrice': '25.50', 'BookValue': '15.57'}
{'StockSymbol': '4306', 'K': '16.36', 'ROE': '21.32', 'ClosePrice': '23.45', 'BookValue': '17.99'}
{'StockSymbol': '5508', 'K': '20.08', 'ROE': '46.17', 'ClosePrice': '55.70', 'BookValue': '24.23'}
{'StockSymbol': '5511', 'K': '21.08', 'ROE': '34.90', 'ClosePrice': '31.85', 'BookValue': '19.24'}
{'StockSymbol': '5515', 'K': '17.08', 'ROE': '16.78', 'ClosePrice': '15.00', 'BookValue': '15.27'}
{'StockSymbol': '6163', 'K': '16.69', 'ROE': '22.44', 'ClosePrice': '19.75', 'BookValue': '14.69'}
{'StockSymbol': '9906', 'K': '55.39', 'ROE': '104.32', 'ClosePrice': '46.20', 'BookValue': '24.53'}
I'll dig into these when I have time.
The Python script:
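For reference, the K column in these rows is simply ROE × BookValue / ClosePrice, formatted to two decimals; the script below computes it the same way. A tiny check against one row of the output (the helper name here is my own):

```python
def k_value(roe, book_value, close_price):
    # K = ROE * B / P, rendered with two decimals as in the script.
    return '%.2f' % (roe * book_value / close_price)


# Checking the 2498 row above: ROE 52.13, book value 91.21, close 254.00.
print(k_value(52.13, 91.21, 254.00))  # → 18.72
```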
import httplib2
import logging
import os
import re
import sys


class KTaiwanBankCrawler():

    def __init__(self, src_filepath):
        self._logger = logging.getLogger()
        self._src_filepath = src_filepath

    def Crawl(self):
        rv = []
        self._Phase0(rv)
        self._Phase1(rv)
        self._Phase2(rv)
        return rv

    def _Phase0(self, rv):
        """
        Crawl stock symbol, ROE, and close price from a prepared source file.
        We need to copy the whole content from http://fund.bot.com.tw/z/index.htm
        to a single source file.
        """
        if not os.path.isfile(self._src_filepath):
            return
        for line in open(self._src_filepath):
            t = line.strip().split()
            if len(t) == 7:
                stock_symbol = t[0][:4]
                if stock_symbol.isdigit():
                    info = {'StockSymbol': stock_symbol, 'ROE': t[4], 'ClosePrice': t[1]}
                    rv.append(info)

    def _Phase1(self, rv):
        for info in rv:
            self._Phase1OneStock(info)

    def _Phase1OneStock(self, info):
        """
        Crawl book value from http://tw.stock.yahoo.com
        """
        bv = ''
        try:
            h = httplib2.Http()
            url = '''http://tw.stock.yahoo.com/d/s/company_%s.html''' % info['StockSymbol']
            resp, content = h.request(url)
            p = '每股淨值: \u3000\u3000'
            m = re.compile('.*' + p + '.*')
            for raw_line in content.splitlines():
                str_line = raw_line.decode('big5')
                if m.match(str_line):
                    begin_index = str_line.index(p) + len(p)
                    end_index = str_line.index('元', begin_index)
                    bv = str_line[begin_index:end_index]
        except Exception:
            pass
        info['BookValue'] = bv
        return info

    def _Phase2(self, rv):
        for info in rv:
            try:
                ROE = float(info['ROE'])
                P = float(info['ClosePrice'])
                B = float(info['BookValue'])
                info['K'] = '''%.2f''' % (ROE * B / P)
            except Exception:
                info['K'] = None


def main():
    import argparse
    import base.logger
    base.logger.config_root(level=logging.DEBUG)
    parser = argparse.ArgumentParser(description='K Analyzer')
    parser.add_argument('src', help='set stock filepath')
    args = parser.parse_args()
    c = KTaiwanBankCrawler(args.src)
    for rv in c.Crawl():
        print(rv)


if __name__ == '__main__':
    sys.exit(main())
Monday, September 3, 2012
Writing very dirty programs in Python
k_web_crawler.py
import httplib2
import logging
import re
import sys

import base.logger
import default_stock_list


class KWebCrawler():

    def __init__(self, stock_symbols):
        self._logger = logging.getLogger()
        self._stock_symbols = stock_symbols

    def Crawl(self):
        return [self._CrawlStock(_) for _ in self._stock_symbols]

    def _CrawlStock(self, stock_symbol):
        rv = {}
        rv['StockSymbol'] = stock_symbol
        rv['ClosePrice'] = self._CrawlClosePrice(stock_symbol)
        company = self._CrawlCompany(stock_symbol)
        rv['ROE'] = company['ROE']
        rv['BookValue'] = company['BookValue']
        rv['K'] = self._GetK(rv)
        self._logger.debug(rv)
        return rv

    def _CrawlCompany(self, stock_symbol):
        company = {'BookValue': None, 'ROE': None}
        try:
            h = httplib2.Http()
            url = '''http://tw.stock.yahoo.com/d/s/company_%s.html''' % stock_symbol
            resp, content = h.request(url)
            p_book_value = '每股淨值: \u3000\u3000'
            m_book_value = re.compile('.*' + p_book_value + '.*')
            m_roe = re.compile('.*股東權益報酬率.*')
            lines = content.splitlines()
            for i in range(len(lines)):
                str_line = lines[i].decode('big5')
                if m_book_value.match(str_line):
                    begin_index = str_line.index(p_book_value) + len(p_book_value)
                    end_index = str_line.index('元', begin_index)
                    company['BookValue'] = str_line[begin_index:end_index]
                if m_roe.match(str_line):
                    next_str_line = lines[i + 1].decode('big5')
                    begin_index = next_str_line.index('>') + 1
                    end_index = next_str_line.index('%<', begin_index)
                    company['ROE'] = next_str_line[begin_index:end_index]
        except Exception:
            pass
        return company

    def _CrawlClosePrice(self, stock_symbol):
        try:
            h = httplib2.Http()
            url = '''http://tw.stock.yahoo.com/q/ts?s=%s''' % stock_symbol
            resp, content = h.request(url)
            pattern = '<table border="0" cellpadding="4" cellspacing="1" width="100%">'
            m = re.compile('.*' + pattern + '.*')
            for line in content.splitlines():
                str_line = line.decode('big5')
                if m.match(str_line):
                    begin_index = [p.start() for p in re.finditer('>', str_line)][39] + 1
                    end_index = str_line.index('<', begin_index)
                    return str_line[begin_index:end_index]
        except Exception:
            return None

    def _GetK(self, basic_info):
        try:
            ROE = float(basic_info['ROE']) * 4
            P = float(basic_info['ClosePrice'])
            B = float(basic_info['BookValue'])
            return '''%.2f''' % (ROE * B / P)
        except Exception:
            return None


def main():
    base.logger.config_root(level=logging.INFO)
    crawl_msci_taiwan()
    crawl_taiwan_50()
    crawl_taiwan_100()
    crawl_taiwan_titc()
    crawl_taiwan_e()
    crawl_taiwan_divid()


def crawl_msci_taiwan():
    stock_list = default_stock_list.DefaultStockList()
    crawler = KWebCrawler(stock_list.GetMsciTaiwanList())
    for rv in crawler.Crawl():
        print(rv)


def crawl_taiwan_50():
    stock_list = default_stock_list.DefaultStockList()
    crawler = KWebCrawler(stock_list.GetTaiwan50List())
    for rv in crawler.Crawl():
        print(rv)


def crawl_taiwan_100():
    stock_list = default_stock_list.DefaultStockList()
    crawler = KWebCrawler(stock_list.GetTaiwan100List())
    for rv in crawler.Crawl():
        print(rv)


def crawl_taiwan_titc():
    stock_list = default_stock_list.DefaultStockList()
    crawler = KWebCrawler(stock_list.GetTaiwanTitcList())
    for rv in crawler.Crawl():
        print(rv)


def crawl_taiwan_e():
    stock_list = default_stock_list.DefaultStockList()
    crawler = KWebCrawler(stock_list.GetTaiwanEList())
    for rv in crawler.Crawl():
        print(rv)


def crawl_taiwan_divid():
    stock_list = default_stock_list.DefaultStockList()
    crawler = KWebCrawler(stock_list.GetTaiwanDividList())
    for rv in crawler.Crawl():
        print(rv)


if __name__ == '__main__':
    sys.exit(main())
default_stock_list.py
import logging
import sys

import base.logger


class DefaultStockList():

    def __init__(self):
        self._logger = logging.getLogger()

    def GetMsciTaiwanList(self):
        """
        Source: http://www.msci.com/eqb/custom_indices/tw_performance.html
        """
        return [
            '2330', '2317', '2454', '2412', '1301', '2002', '1303', '1326', '2357', '2882',
            '1216', '2382', '2498', '3045', '2308', '2891', '2886', '2881', '2303', '2311',
            '2105', '4904', '2892', '2885', '2324', '1101', '2325', '6505', '1402', '2912',
            '2883', '2354', '2474', '2347', '2880', '3673', '5880', '2887', '2301', '2890',
            '3231', '2409', '2801', '1102', '2353', '2884', '9904', '3008', '4938', '6176',
            '1722', '2207', '3697', '3702', '3034', '6121', '3481', '2888', '2823', '2201',
            '2448', '9921', '3037', '6239', '2049', '2103', '1504', '1314', '9933', '1605',
            '2610', '2385', '9945', '2395', '2337', '1802', '8299', '2903', '3044', '2618',
            '2915', '2603', '2384', '2834', '2379', '8069', '6286', '3189', '2392', '2356',
            '1434', '1227', '2101', '2006', '2542', '2606', '2015', '5522', '1704', '2362',
            '1717', '2615', '2609', '6005', '5483', '4958', '2393', '2204', '2451', '1590',
            '2707', '3474', '8046', '6244'
        ]

    def GetTaiwan50List(self):
        """
        http://www.twse.com.tw/ch/trading/indices/twco/tai50i.php
        """
        return [
            '3673', '3697', '3481', '2330', '2303', '2882', '2357', '1303', '2883', '1301',
            '2002', '2311', '2317', '1402', '2324', '2892', '2880', '2801', '1216', '1101',
            '1102', '2201', '2382', '2308', '1326', '2886', '2891', '2325', '2353', '1722',
            '2105', '2412', '2409', '2207', '2301', '2912', '2354', '2347', '2474', '3045',
            '2454', '2881', '4904', '2885', '3008', '2498', '2890', '3231', '6505', '5880'
        ]

    def GetTaiwan100List(self):
        """
        http://www.twse.com.tw/ch/trading/indices/tmcc/tai100i.php
        """
        return [
            '3406', '3474', '8046', '2049', '4938', '8422', '1789', '2723', '1590', '2727',
            '5871', '4958', '4725', '3149', '2344', '2371', '2356', '2888', '2609', '2337',
            '2388', '1504', '1507', '2603', '2501', '2610', '2204', '1605', '1314', '1434',
            '2809', '2812', '1440', '1907', '1802', '2379', '2393', '2834', '2707', '1723',
            '3702', '2384', '2385', '2542', '2545', '2392', '2823', '2395', '2362', '2360',
            '9933', '1717', '2607', '2845', '2903', '2015', '9921', '2504', '2106', '1704',
            '9914', '2101', '1710', '9904', '2511', '9945', '2915', '1227', '2103', '9917',
            '9907', '1319', '2006', '2606', '1304', '2615', '2327', '3037', '2855', '6005',
            '2618', '9940', '5522', '2548', '3034', '8008', '2887', '3044', '2451', '2448',
            '2450', '6176', '2884', '2889', '6239', '6285', '6269', '6286', '8078', '3189'
        ]

    def GetTaiwanTitcList(self):
        """
        http://www.twse.com.tw/ch/trading/indices/titc/taititc.php
        """
        return [
            '3474', '4938', '3697', '3481', '2330', '2303', '2357', '2344', '2311', '2324',
            '2371', '2356', '2382', '2337', '2388', '2325', '2353', '2379', '3702', '2385',
            '2395', '2362', '2409', '2347', '3034', '2454', '8008', '2451', '2448', '2450',
            '2498', '6239', '3231', '6285', '6286', '8078', '3189'
        ]

    def GetTaiwanEList(self):
        """
        http://www.twse.com.tw/ch/trading/indices/twei/taiei.php
        """
        return [
            '3406', '8046', '2049', '8422', '1789', '3673', '2723', '1590', '2727', '4958',
            '4725', '3149', '1303', '1301', '2002', '2317', '1402', '1216', '1101', '1102',
            '2201', '2609', '2308', '1326', '1504', '1507', '2603', '2610', '2204', '1605',
            '1314', '1434', '1440', '1907', '1802', '2393', '2707', '1722', '1723', '2384',
            '2545', '2392', '2105', '2360', '9933', '2412', '1717', '2607', '2903', '2015',
            '9921', '2504', '2207', '2106', '1704', '2301', '9914', '2101', '1710', '9904',
            '2912', '9945', '2915', '2354', '1227', '2103', '9917', '9907', '1319', '2006',
            '2606', '1304', '2615', '2327', '3037', '2618', '2474', '3045', '3044', '4904',
            '3008', '6176', '6269', '6505'
        ]

    def GetTaiwanDividList(self):
        """
        http://www.twse.com.tw/ch/trading/indices/twdp/taidividi.php
        """
        return [
            '8422', '4725', '2303', '1303', '1301', '2324', '1101', '2382', '1326', '2886',
            '2325', '2379', '1723', '2385', '2542', '2412', '2301', '1710', '2103', '2606',
            '1304', '5522', '2548', '3045', '3034', '2454', '2451', '4904', '3231', '6285'
        ]


def main():
    base.logger.config_root(level=logging.DEBUG)
    stock_list = DefaultStockList()
    print(stock_list.GetMsciTaiwanList())
    print(stock_list.GetTaiwan50List())
    print(stock_list.GetTaiwan100List())
    print(stock_list.GetTaiwanTitcList())
    print(stock_list.GetTaiwanEList())
    print(stock_list.GetTaiwanDividList())


if __name__ == '__main__':
    sys.exit(main())