Hundred-Million-Traffic E-commerce Product Detail Page System in Practice - 36: Real-Time Reporting of Product Detail Page Access Traffic to Kafka with nginx + lua

Tags: # Mall

1. Preface

At the nginx layer, as soon as an access request is received, the request traffic is reported to Kafka.

That way, Storm can consume the real-time access logs from Kafka and compute the statistics used to identify hot cache data.

The technical approach is very simple: create a Kafka producer directly in a lua script and send the data to Kafka.

2. Configuration

2.1 Download lua-resty-kafka


#cd /usr/servers/lualib/resty
#wget https://github.com/doujiang24/lua-resty-kafka/archive/v0.07.zip
#yum install -y unzip
#unzip v0.07.zip
#cp -rf lua-resty-kafka-0.07/lib/resty/kafka .

2.2 Write the lua Kafka producer script

# vi /usr/lua/eshop/lua/product.lua

local cjson = require("cjson")
local producer = require("resty.kafka.producer")

-- list of Kafka brokers the producer can connect to
local broker_list = {
    { host = "192.168.135.135", port = 9092 },
    { host = "192.168.135.132", port = 9092 },
    { host = "192.168.135.136", port = 9092 }
}

-- collect the details of the current request into a log record
local log_json = {}
log_json["request_module"] = "product_detail_info"
log_json["headers"] = ngx.req.get_headers()
log_json["uri_args"] = ngx.req.get_uri_args()
log_json["http_version"] = ngx.req.http_version()
log_json["method"] = ngx.req.get_method()
log_json["raw_reader"] = ngx.req.raw_header()
-- ngx.req.read_body() returns nothing; it must be called first so that
-- ngx.req.get_body_data() can return the request body
ngx.req.read_body()
log_json["body_data"] = ngx.req.get_body_data()

local message = cjson.encode(log_json)
local productId = ngx.req.get_uri_args()["productId"]

-- async producer: messages are buffered and flushed in a background timer,
-- so the request is not blocked on the Kafka round trip
local async_producer = producer:new(broker_list, { producer_type = "async" })

-- topic "access-log", keyed by productId
local ok, err = async_producer:send("access-log", productId, message)
if not ok then
    ngx.log(ngx.ERR, "kafka send err:", err)
    return
end
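For illustration, the JSON record the lua script publishes looks roughly like the sketch below (field values here are hypothetical, not captured from a real request). The important design point is that the Kafka message key is the productId, so every access event for a given product lands in the same partition and can be counted per product by the downstream Storm topology.

```python
import json

# Hypothetical payload mirroring what the lua script encodes with cjson;
# the header/argument values are illustrative only.
log_json = {
    "request_module": "product_detail_info",
    "headers": {"host": "192.168.135.135", "user-agent": "curl/7.29.0"},
    "uri_args": {"method": "product", "productId": "1", "shopId": "1"},
    "http_version": 1.1,
    "method": "GET",
    "body_data": None,
}

# Kafka message key: the productId, for per-product partitioning
key = log_json["uri_args"]["productId"]
message = json.dumps(log_json)
print(key, message)
```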

2.3 Apply the nginx configuration

-- add "resolver 8.8.8.8" so nginx can resolve the Kafka broker hostnames
# vi /usr/servers/nginx/conf/nginx.conf
http {
    resolver 8.8.8.8;
    include       mime.types;
    default_type  application/octet-stream;
    ...

#/usr/servers/nginx/sbin/nginx -s reload
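For the reload to have any effect, product.lua also has to be wired into a location block. The chapter does not show the server block; the sketch below is an assumption based on the paths and the /eshop URL used elsewhere in this article (lua_package_path lets `require("resty.kafka.producer")` find the library copied in step 2.1):

```nginx
http {
    resolver 8.8.8.8;
    lua_package_path "/usr/servers/lualib/?.lua;;";

    server {
        listen 80;

        location /eshop {
            default_type "text/html";
            # run the producer script for every product detail request
            content_by_lua_file /usr/lua/eshop/lua/product.lua;
        }
    }
}
```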

2.4 Configure Kafka

-- set advertised.host.name
# vi /usr/local/kafka/config/server.properties
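The article does not show the property value. On each broker, advertised.host.name should be that broker's own IP, so that the address Kafka hands back to clients is one the nginx machine can actually reach. A sketch for the 192.168.135.135 node (broker.id and port are assumptions; adjust the IP per node):

```properties
# /usr/local/kafka/config/server.properties (on the 192.168.135.135 node)
broker.id=0
port=9092
# address advertised to clients; without it Kafka advertises its local
# hostname, which other machines may not be able to resolve
advertised.host.name=192.168.135.135
```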

-- restart the Kafka process
# jps
1215 Kafka
#kill 1215
#nohup /usr/local/kafka/bin/kafka-server-start.sh /usr/local/kafka/config/server.properties &

-- create the access-log topic
# /usr/local/kafka/bin/kafka-topics.sh --zookeeper 192.168.135.135:2181,192.168.135.132:2181,192.168.135.136:2181 --topic access-log --replication-factor 1 --partitions 1 --create

-- verify by consuming the topic
# /usr/local/kafka/bin/kafka-console-consumer.sh --zookeeper 192.168.135.135:2181,192.168.135.132:2181,192.168.135.136:2181 --topic access-log --from-beginning

3. Testing

  1. Start the eshop-cache cache service.
  2. Visit http://192.168.135.135/eshop?method=product&productId=1&shopId=1 and confirm that the access log record appears in the Kafka console consumer started above.