Abstract: This article uses Huawei Cloud's Log Tank Service (LTS) to capture the logs of the highly available backend servers behind an Elastic Load Balance (ELB) instance. An LTS trigger invokes a serverless FunctionGraph function that receives the ELB logs in real time and writes them to the Cloud Search Service (CSS); finally, Kibana is used for data visualization.
Step1: Create two ECS servers and install the NGINX service
1. For how to set up NGINX on a Huawei Cloud ECS, see "OBS NGINX Reverse Proxy for HTTPS Custom Domain Access";
2. Launch a second server from an IMS image; for traffic peaks, Auto Scaling (AS) rules can add or remove servers as business demand changes;
3. Configure both servers to start the NGINX service on boot.
Step2: Create an ELB instance and configure the listener and backend servers
1. Create the ELB instance (make sure it is in the same VPC as the ECS servers) and bind a public IP
2. Add an HTTP/HTTPS listener
3. Add the backend servers on listening port 80 and select the weighted round robin algorithm
Step3: Configure the LTS service to capture ELB logs (currently only HTTP/HTTPS protocol logs are supported)
Step4: Create the FunctionGraph function, add its configuration, and write the code
1. Create a FunctionGraph function; this experiment uses the Python 3.6 runtime
2. Add the Python dependency package via an OBS URL (the package must be uploaded to OBS in advance)
3. Import the dependency packages in the function code
4. Configure the function's runtime settings
5. Add an LTS trigger
6. Project source code
# -*- coding:utf-8 -*-
import json
import base64
import csv
import requests
import pandas as pd
from elasticsearch import Elasticsearch, helpers

# Fetch the VPC security group data and store it in a CSV file
def get_vpc_sg(regionName, Token, projectId):
    headers = {'X-Auth-Token': Token, "accept": "application/json"}
    url = "https://vpc." + regionName + ".g42cloud.com/v1/" + projectId + "/security-groups"
    response = requests.get(url, headers=headers)
    if response.status_code != 200:
        print(response.status_code)
        print("get vpc security group failed.")
    data_json = json.loads(response.text)
    data = data_json['security_groups']
    print(data)
    # Normalize the nested security group rules
    multiple_level_data = pd.json_normalize(
        data,
        record_path=['security_group_rules'],
        meta=['id', 'name', 'description', 'enterprise_project_id'],
        meta_prefix='security_groups_',
        record_prefix='security_group_rules_')
    print(multiple_level_data)
    # Save to CSV format
    multiple_level_data.to_csv('/tmp/multiplelevel_normalized_vpc_data.csv', index=False)

# Bulk-load the CSV file into CSS (Elasticsearch)
def load_csv_css(csv_file_path):
    es = Elasticsearch(
        ['<CSS cluster client IP>'],  # replace with your CSS cluster's client IP
        # http_auth=('admin', '******'),
        # sniff before doing anything
        sniff_on_start=True,
        # refresh nodes after a node fails to respond
        sniff_on_connection_fail=True,
        # and also every 60 seconds
        sniffer_timeout=60,
        # set sniffing request timeout to 10 seconds
        sniff_timeout=10
    )
    with open(csv_file_path, 'r', encoding='utf-8') as f:
        reader = csv.DictReader(f)
        helpers.bulk(es, reader, index='elb_lts_log')

def handler(event, context):
    # The LTS trigger delivers the log payload base64-encoded in event["lts"]["data"];
    # the decoded JSON carries the records as a JSON string in its 'logs' field
    ltsmsg = event["lts"]["data"]
    newdata = base64.b64decode(ltsmsg).decode('UTF-8')
    data = json.loads(newdata)
    strlogs = data['logs']
    strjson = json.loads(strlogs)
    print(strjson)
    # Flatten the log records and save them in CSV format
    multiple_level_data = pd.json_normalize(strjson)
    multiple_level_data.to_csv('/tmp/elb_lts_log.csv', index=False)
    SMN_Topic = context.getUserData('SMN_Topic')
    regionName = context.getUserData('RegionName')
    Token = context.getToken()
    projectId = context.getProjectID()
    # Optionally also capture the VPC security groups:
    # get_vpc_sg(regionName, Token, projectId)
    csv_file_path = '/tmp/elb_lts_log.csv'
    load_csv_css(csv_file_path)
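The LTS payload decoding in handler() can be exercised locally without a real trigger. The sketch below builds a synthetic event (the record fields are hypothetical, chosen only to illustrate the shape) and unpacks it the same way the handler does:

```python
import base64
import json

# Build a synthetic LTS trigger event. The record below is hypothetical;
# real ELB log records carry the fields defined by your LTS log stream.
sample_records = [{"host_name": "elb-backend-1", "status": "200"}]
payload = json.dumps({"logs": json.dumps(sample_records)})
event = {"lts": {"data": base64.b64encode(payload.encode("utf-8")).decode("utf-8")}}

# Decode exactly as handler() does: base64 -> JSON -> inner 'logs' JSON string
newdata = base64.b64decode(event["lts"]["data"]).decode("UTF-8")
records = json.loads(json.loads(newdata)["logs"])
print(records)  # -> [{'host_name': 'elb-backend-1', 'status': '200'}]
```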
Step5: Create the CSS service and create an index matching the ELB log output format
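The index can be created in advance with an explicit mapping. The sketch below uses hypothetical field names (`host_name`, `status`, `request_time`); align them with the columns that pd.json_normalize() actually produces from your ELB log format:

```python
# Hypothetical mapping for the flattened ELB log fields; the field names
# are illustrative and must match your actual ELB log output format.
elb_log_mapping = {
    "mappings": {
        "properties": {
            "host_name":    {"type": "keyword"},
            "status":       {"type": "keyword"},
            "request_time": {"type": "float"},
        }
    }
}
print(elb_log_mapping)

# Against a live CSS cluster this would be applied as (client IP is a placeholder):
# es = Elasticsearch(['<CSS cluster client IP>'])
# es.indices.create(index='elb_lts_log', body=elb_log_mapping)
```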
Step6: Refresh the NGINX service on the ELB backend servers and verify that data is written to CSS correctly
1. Refresh the web page; the NGINX HTML home page is displayed
2. The ELB log correctly records the NGINX server access information
3. LTS correctly captures the ELB log entries
4. The FunctionGraph LTS trigger fires in real time and writes the data
5. Check the data in CSS Kibana
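Besides browsing the data in Kibana Discover, a quick ingestion check is a count query against the index used by the bulk load above; a minimal sketch (the client IP is a placeholder):

```python
# Minimal verification query; match_all simply counts every ingested document.
query = {"query": {"match_all": {}}}
print(query)

# With a live CSS cluster (placeholder IP) one could run:
# es = Elasticsearch(['<CSS cluster client IP>'])
# print(es.count(index='elb_lts_log', body=query)['count'])
```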
The ELB backend server logs are captured in real time, processed by the function, and visualized with Elasticsearch and Kibana. Done!