Sample code for real-time security detection with Flink
Background
Most of our security team's auditing of network access logs from internal systems currently runs on a T+1 schedule: every day, Python cron jobs are started to audit and check the previous day's logs, and only after they finish are alerts pushed to WeChat Work in one batch. In today's threat environment and at our current headcount, this approach has two pain points. First, against increasingly frequent network attacks and phishing links, a T+1 batch job cannot raise alerts in time, so problems such as leakage of sensitive information are hard to prevent. Second, the single-machine Python jobs take anywhere from one hour to more than ten hours depending on the scenario, which is inefficient. To address this, I worked with the security team to rebuild the alert-collection platform, moving from single-machine Python jobs to parallel processing on a Flink cluster; alert latency improved from T+1 to near real time (seconds). The migration covers several common log-audit scenarios, such as port scanning, blacklist statistics, abnormal traffic, and repeated malicious logins. This article walks through one of them: a successful login that follows 20 consecutive failed logins within a given time window.
Scenario
For an internal system such as the mail system, employees' access logs are stored in Kafka. We want to catch the following situation promptly: within any 3-minute period, a user account fails to log in to the mail system 20 times in a row from the same IP and then logs in successfully. When this happens, an alert should be pushed to a designated security contact on WeChat Work. Each Kafka record carries a keyword that tells whether it is a login event, as well as the access time (i.e. the event time). Once a matching account is detected, an alert is pushed to WeChat Work immediately, and the account's access events from that period are written to a downstream ElasticSearch index.
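In Flink terms, the rule boils down to keying the stream by account plus client IP and scanning a 3-minute sliding event-time window for at least 20 consecutive failures followed by a success. The following is only a condensed view of the pipeline that the rest of this article builds step by step; loginWmDs stands for the watermarked login stream created in the main class below.
// Condensed sketch of the pipeline assembled in the main class:
// group by account + client IP, slide a 3-minute window every 90 seconds,
// and scan each window for 20 consecutive failures followed by a success.
loginWmDs
        .keyBy(new LoginKeySelector())
        .window(SlidingEventTimeWindows.of(Time.seconds(180L), Time.seconds(90L)))
        .process(new WindowProcessFuncImpl());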
Component versions
- Flink-1.14.4
- Java8
- ElasticSearch-7.3.2
- Kafka-2.12_2.8.1
Log structure
The IPs and account values below are test data.
{
"user": "wangxm",
"client_ip": "110.68.6.182",
"source": "login",
"loginname": "wangxm@test.com",
"IP": "110.8.148.58",
"timestamp": "17:58:12",
"@timestamp": "2022-04-20T09:58:13.647Z",
"ip": "110.7.231.25",
"clienttype": "POP3",
"result": "success",
"@version": "1"
}
Technical approach
This scenario can be implemented either with FlinkCEP or with Flink's sliding windows. After writing and debugging a FlinkCEP version, I found it did not meet the requirement, so I switched to a sliding-window implementation.
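For reference, the FlinkCEP variant that was tried first would look roughly like the sketch below. This is only an illustration of the attempted approach, assuming the flink-cep dependency and the MailMsg bean shown later; it is not the final implementation.
import com.data.dev.common.javabean.kafkaMailTopic.MailMsg;
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.windowing.time.Time;

// 20 consecutive failed logins strictly followed by one successful login, all within 3 minutes
Pattern<MailMsg, MailMsg> loginFailPattern = Pattern.<MailMsg>begin("fails")
        .where(new SimpleCondition<MailMsg>() {
            @Override
            public boolean filter(MailMsg msg) {
                return "fail".equals(msg.getResult());
            }
        })
        .times(20).consecutive()
        .next("success")
        .where(new SimpleCondition<MailMsg>() {
            @Override
            public boolean filter(MailMsg msg) {
                return "success".equals(msg.getResult());
            }
        })
        .within(Time.minutes(3));

// loginKeyedDs is the keyed login stream built in the main class below;
// matches would then be converted into alarm events by a PatternProcessFunction.
PatternStream<MailMsg> matches = CEP.pattern(loginKeyedDs, loginFailPattern);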
Key code
Main entry class
The main entry class creates the Flink environment, sets the basic parameters, and builds the Kafka source. After ingesting the messages it maps and filters them, assigns watermarks, keys the stream, applies a sliding window, runs the detection logic inside each window, and collects the matching events and writes them to ElasticSearch.
The map, filter, keyBy and window operators are each implemented in their own classes, which are listed one by one below.
package com.data.dev.flink.mailTopic.main;
import com.data.dev.common.javabean.BaseBean;
import com.data.dev.common.javabean.kafkaMailTopic.MailMsgAlarm;
import com.data.dev.elasticsearch.ElasticSearchInfo;
import com.data.dev.elasticsearch.SinkToEs;
import com.data.dev.flink.FlinkEnv;
import com.data.dev.flink.mailTopic.OperationForLoginFailCheck.*;
import com.data.dev.kafka.KafkaSourceBuilder;
import com.data.dev.key.ConfigurationKey;
import com.data.dev.utils.TimeUtils;
import lombok.extern.slf4j.Slf4j;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.KeyedStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.datastream.WindowedStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.SlidingEventTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import java.time.Duration;
/**
* Flink job for the scenario "20 consecutive failed logins within 3 minutes followed by a successful login"
* Implemented with a sliding event-time window
* @author wangxiaomin 2022-06-01
*/
@Slf4j
public class MailMsg extends BaseBean {
/**
* Flink job name
*/
public static final String JobName = "Alarm collection platform - successful login after consecutive failed logins";
/**
* Name of the Kafka source
*/
public static final String KafkaSourceName = "Kafka Source for AlarmPlatform About Mail Topic";
public MailMsg(){
log.info("Initializing the sliding-window alert job");
}
/**
* Run the detection logic and push alerts
*/
public static void execute(){
//① Create the Flink execution environment and set checkpointing and other required parameters
StreamExecutionEnvironment env = FlinkEnv.getFlinkEnv();
KafkaSource<String> kafkaSource = KafkaSourceBuilder.getKafkaSource(ConfigurationKey.KAFKA_MAIL_TOPIC_NAME,ConfigurationKey.KAFKA_MAIL_CONSUMER_GROUP_ID) ;
DataStreamSource<String> kafkaMailMsg = env.fromSource(kafkaSource, WatermarkStrategy.forBoundedOutOfOrderness(Duration.ofMillis(10)), KafkaSourceName);
//② Map the raw messages to beans and keep only the login events
SingleOutputStreamOperator<com.data.dev.common.javabean.kafkaMailTopic.MailMsg> loginMapDs = kafkaMailMsg.map(new MsgToBeanMapper()).name("Map operator");
SingleOutputStreamOperator<com.data.dev.common.javabean.kafkaMailTopic.MailMsg> loginFilterDs = loginMapDs.filter(new MailMsgForLoginFilter()).name("Filter operator");
//③ Assign event-time timestamps and watermarks (overrides the source-level strategy for downstream operators)
WatermarkStrategy<com.data.dev.common.javabean.kafkaMailTopic.MailMsg> watermarkStrategy = WatermarkStrategy.<com.data.dev.common.javabean.kafkaMailTopic.MailMsg>forBoundedOutOfOrderness(Duration.ofMinutes(1))
.withTimestampAssigner((mailMsg, timestamp) -> TimeUtils.switchUTCToBeijingTimestamp(mailMsg.getTimestamp_datetime()));
SingleOutputStreamOperator<com.data.dev.common.javabean.kafkaMailTopic.MailMsg> loginWmDs = loginFilterDs.assignTimestampsAndWatermarks(watermarkStrategy.withIdleness(Duration.ofMinutes(3))).name("Assign watermarks");
//④ Key the stream by user account + client IP
KeyedStream<com.data.dev.common.javabean.kafkaMailTopic.MailMsg, String> loginKeyedDs = loginWmDs.keyBy(new LoginKeySelector());
//⑤ Apply a sliding event-time window (3-minute window, 90-second slide)
WindowedStream<com.data.dev.common.javabean.kafkaMailTopic.MailMsg, String, TimeWindow> loginWindowDs = loginKeyedDs.window(SlidingEventTimeWindows.of(Time.seconds(180L),Time.seconds(90L)));
//⑥ Run the detection logic inside each window
SingleOutputStreamOperator<MailMsgAlarm> loginWindowsDealDs = loginWindowDs.process(new WindowProcessFuncImpl()).name("Window processing");
//⑦ Convert the results to a plain DataStream<String>
SingleOutputStreamOperator<String> resultDs = loginWindowsDealDs.map(new AlarmMsgToStringMapper()).name("Convert window result to JSON string");
//⑧ Write the final results to ElasticSearch
resultDs.addSink(SinkToEs.getEsSinkBuilder(ElasticSearchInfo.ES_LOGIN_FAIL_INDEX_NAME,ElasticSearchInfo.ES_INDEX_TYPE_DEFAULT).build());
//⑨ Submit the job to the Flink cluster
FlinkEnv.envExec(env,JobName);
}
}
Map operator
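The MsgToBeanMapper used in step ② converts each raw Kafka JSON record into a MailMsg bean. Its listing is not reproduced in this article; a minimal sketch, assuming fastjson (already used by AlarmMsgToStringMapper below) and the field names from the log-structure section, could look like this:
package com.data.dev.flink.mailTopic.OperationForLoginFailCheck;
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import com.data.dev.common.javabean.BaseBean;
import com.data.dev.common.javabean.kafkaMailTopic.MailMsg;
import lombok.extern.slf4j.Slf4j;
import org.apache.flink.api.common.functions.MapFunction;
/**
 * Hypothetical sketch: parse a raw Kafka JSON record into a MailMsg bean.
 * The JSON keys follow the sample log structure shown earlier.
 */
@Slf4j
public class MsgToBeanMapper extends BaseBean implements MapFunction<String, MailMsg> {
    @Override
    public MailMsg map(String value) {
        JSONObject json = JSON.parseObject(value);
        MailMsg msg = new MailMsg();
        msg.setUser(json.getString("user"));
        msg.setClient_ip(json.getString("client_ip"));
        msg.setSource(json.getString("source"));
        msg.setLoginName(json.getString("loginname"));
        msg.setMailSenderSourceIp(json.getString("IP"));
        msg.setTimestamp_time(json.getString("timestamp"));
        msg.setTimestamp_datetime(json.getString("@timestamp"));
        msg.setIp(json.getString("ip"));
        msg.setClientType(json.getString("clienttype"));
        msg.setResult(json.getString("result"));
        msg.setVersion(json.getString("@version"));
        return msg;
    }
}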
Filter operator
package com.data.dev.flink.mailTopic.OperationForLoginFailCheck;
import com.data.dev.common.javabean.BaseBean;
import com.data.dev.common.javabean.kafkaMailTopic.MailMsg;
import lombok.extern.slf4j.Slf4j;
import org.apache.flink.api.common.functions.FilterFunction;
/**
* ② Consume messages from the mail topic and keep only the login events
* @author wangxiaoming-ghq 2022-06-01
*/
@Slf4j
public class MailMsgForLoginFilter extends BaseBean implements FilterFunction<MailMsg> {
@Override
public boolean filter(MailMsg mailMsg) {
if("login".equals(mailMsg.getSource())) {
log.info("篩選原始的login事件:【" + mailMsg + "】");
}
return "login".equals(mailMsg.getSource());
}
}
keyBy operator
package com.data.dev.flink.mailTopic.OperationForLoginFailCheck;
import com.data.dev.common.javabean.BaseBean;
import com.data.dev.common.javabean.kafkaMailTopic.MailMsg;
import lombok.extern.slf4j.Slf4j;
import org.apache.flink.api.java.functions.KeySelector;
/**
* Key selector: keys the login stream by user account + client IP
*/
@Slf4j
public class LoginKeySelector extends BaseBean implements KeySelector<MailMsg, String> {
@Override
public String getKey(MailMsg mailMsg) {
return mailMsg.getUser() + "@" + mailMsg.getClient_ip();
}
}
Window function (core logic)
The idea is to keep a list of events (loginEventList) for each window. Every failed login detected in the window is appended to the list. When a successful login arrives but fewer than 20 failures have accumulated, loginEventList is cleared and detection waits for the next round. When a successful login arrives after at least 20 consecutive failures in the window, loginEventList is cleared and the successful login event is pushed as an alert.
package com.data.dev.flink.mailTopic.OperationForLoginFailCheck;
import com.data.dev.common.javabean.kafkaMailTopic.MailMsg;
import com.data.dev.common.javabean.kafkaMailTopic.MailMsgAlarm;
import com.data.dev.utils.HttpUtils;
import com.data.dev.utils.IPUtils;
import lombok.extern.slf4j.Slf4j;
import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;
/**
* Detection logic for each sliding window: at least 20 consecutive failed logins followed by a success
* @author wangxiaoming-ghq 2022-06-01
*/
@Slf4j
public class WindowProcessFuncImpl extends ProcessWindowFunction<MailMsg, MailMsgAlarm, String, TimeWindow> implements Serializable {
@Override
public void process(String key, ProcessWindowFunction<MailMsg, MailMsgAlarm, String, TimeWindow>.Context context, Iterable<MailMsg> iterable, Collector<MailMsgAlarm> collector) {
List<MailMsg> loginEventList = new ArrayList<>();
MailMsgAlarm mailMsgAlarm;
for (MailMsg mailMsg : iterable) {
log.info("Login event collected in this window: [" + mailMsg + "]");
if (mailMsg.getResult().equals("fail")) { //collect every failed login in the current window into loginEventList
log.info("Failed login detected; adding it to loginEventList");
loginEventList.add(mailMsg);
} else if (mailMsg.getResult().equals("success") && loginEventList.size() < 20) {//a success with fewer than 20 preceding failures resets the list
log.info("Successful login detected, but only [" + loginEventList.size() + "] preceding failures (fewer than 20); clearing loginEventList and waiting for the next round");
loginEventList.clear();
} else if (mailMsg.getResult().equals("success") && loginEventList.size() >= 20) {
mailMsgAlarm = getMailMsgAlarm(loginEventList,mailMsg);
log.info("Successful login detected after [" + mailMsgAlarm.getFailTimes() + "] consecutive failures in this window");
//at least 20 consecutive failures followed by a success: clear the list and push an alert for the successful login
loginEventList.clear();
doAlarmPush(mailMsgAlarm);
collector.collect(mailMsgAlarm);//emit the matching successful login downstream
} else {
//login results other than "fail"/"success" end up here
log.info(mailMsg.getUser() + " currently has [" + loginEventList.size() + "] consecutive failed logins");
}
}
}
/**
* 2022-06-17 15:03:06
* @param eventList: events collected in the current window
* @param eventCurrent: the current successful login event
* @return mailMsgAlarm: the alarm message body
*/
public static MailMsgAlarm getMailMsgAlarm(List<MailMsg> eventList,MailMsg eventCurrent){
String alarmKey = eventCurrent.getUser() + "@" + eventCurrent.getClient_ip();
String loginFailStartTime = eventList.get(0).getTimestamp_datetime();
String loginSuccessTime = eventCurrent.getTimestamp_datetime();
int loginFailTimes = eventList.size();
MailMsgAlarm mailMsgAlarm = new MailMsgAlarm();
mailMsgAlarm.setMailMsg(eventCurrent);
mailMsgAlarm.setAlarmKey(alarmKey);
mailMsgAlarm.setStartTime(loginFailStartTime);
mailMsgAlarm.setEndTime(loginSuccessTime);
mailMsgAlarm.setFailTimes(loginFailTimes);
return mailMsgAlarm;
}
/**
* 2022-06-17 14:47:53
* @param mailMsgAlarm : the alarm event to be pushed
*/
public void doAlarmPush(MailMsgAlarm mailMsgAlarm){
String userKey = mailMsgAlarm.getAlarmKey();
String clientIp = mailMsgAlarm.mailMsg.getClient_ip();
boolean isWhiteListIp = IPUtils.isWhiteListIp(clientIp);
if(isWhiteListIp){//whitelisted IPs do not trigger an alert
log.info("Login user [" + userKey + "] comes from a whitelisted IP; skipping the alert");
}else {
//look up the owner of the IP and push the alert to WeChat Work
String user = HttpUtils.getUserByClientIp(clientIp);
HttpUtils.pushAlarmMsgToWechatWork(user,mailMsgAlarm.toString());
}
}
}
Final map operator
package com.data.dev.flink.mailTopic.OperationForLoginFailCheck;
import com.alibaba.fastjson.JSON;
import com.data.dev.common.javabean.BaseBean;
import com.data.dev.common.javabean.kafkaMailTopic.MailMsgAlarm;
import lombok.extern.slf4j.Slf4j;
import org.apache.flink.api.common.functions.MapFunction;
/**
* Converts the alarm event into the JSON string pushed to ES
* @author wangxiaoming-ghq 2022-06-01
*/
@Slf4j
public class AlarmMsgToStringMapper extends BaseBean implements MapFunction<MailMsgAlarm, String> {
@Override
public String map(MailMsgAlarm mailMsgAlarm) throws Exception {
return JSON.toJSONString(mailMsgAlarm);
}
}
ElasticSearch utility class
package com.data.dev.elasticsearch;
import com.data.dev.common.javabean.BaseBean;
import com.data.dev.key.ConfigurationKey;
import com.data.dev.key.ElasticSearchKey;
import lombok.extern.slf4j.Slf4j;
import org.apache.flink.api.common.functions.RuntimeContext;
import org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction;
import org.apache.flink.streaming.connectors.elasticsearch.RequestIndexer;
import org.apache.flink.streaming.connectors.elasticsearch7.ElasticsearchSink;
import org.apache.flink.streaming.connectors.elasticsearch7.RestClientFactory;
import org.apache.http.HttpHost;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;
import org.elasticsearch.common.xcontent.XContentType;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/**
* 2022-06-17 15:15:06
* @author wangxiaoming-ghq
* Common helper for writing Flink streaming results to ES
*/
@Slf4j
public class SinkToEs extends BaseBean {
public static final long serialVersionUID = 2L;
private static final HashMap<String,String> ES_PROPS_MAP = ConfigurationKey.getApplicationProps();
private static final String HOST = ES_PROPS_MAP.get(ConfigurationKey.ES_HOST);
private static final String PASSWORD = ES_PROPS_MAP.get(ConfigurationKey.ES_PASSWORD);
private static final String USERNAME = ES_PROPS_MAP.get(ConfigurationKey.ES_USERNAME);
private static final String PORT = ES_PROPS_MAP.get(ConfigurationKey.ES_PORT);
/**
* 2022-06-17 15:17:55
* Get the ES connection info
* @return esInfoMap: map holding the ES connection info
*/
public static HashMap<String,String > getElasticSearchInfo(){
log.info("獲取ES連接信息:【 " + "HOST="+HOST + "PORT="+PORT+"USERNAME="+USERNAME+"PASSWORD=********" + " 】");
HashMap<String,String> esInfoMap = new HashMap<>();
esInfoMap.put(ElasticSearchKey.HOST,HOST);
esInfoMap.put(ElasticSearchKey.PASSWORD,PASSWORD);
esInfoMap.put(ElasticSearchKey.USERNAME,USERNAME);
esInfoMap.put(ElasticSearchKey.PORT,PORT);
return esInfoMap;
}
/**
* @param esIndexName: name of the target index
* @param esType: type of the target index
* @return ElasticsearchSink.Builder<String>: the sink builder
*/
public static ElasticsearchSink.Builder<String> getEsSinkBuilder(String esIndexName,String esType){
HashMap<String, String> esInfoMap = getElasticSearchInfo();
List<HttpHost> httpHosts = new ArrayList<>();
httpHosts.add(new HttpHost(String.valueOf(esInfoMap.get(ElasticSearchKey.HOST)), Integer.parseInt(esInfoMap.get(ElasticSearchKey.PORT)), "http"));
ElasticsearchSink.Builder<String> esSinkBuilder = new ElasticsearchSink.Builder<>(
httpHosts,
new ElasticsearchSinkFunction<String>() {
public IndexRequest createIndexRequest(String element) {
//element is already the JSON string produced by AlarmMsgToStringMapper, so index it as-is
return Requests.indexRequest()
.index(esIndexName)
.type(esType)
.source(element, XContentType.JSON);
}
@Override
public void process(String element, RuntimeContext ctx, RequestIndexer indexer) {
indexer.add(createIndexRequest(element));
}
}
}
);
//ES connection settings, with username/password authentication
RestClientFactory restClientFactory = restClientBuilder -> {
CredentialsProvider credentialsProvider = new BasicCredentialsProvider();
credentialsProvider.setCredentials(
AuthScope.ANY,
new UsernamePasswordCredentials(
String.valueOf(esInfoMap.get(ElasticSearchKey.USERNAME)),
String.valueOf(esInfoMap.get(ElasticSearchKey.PASSWORD))
)
);
restClientBuilder.setHttpClientConfigCallback(httpAsyncClientBuilder -> {
httpAsyncClientBuilder.disableAuthCaching();
return httpAsyncClientBuilder.setDefaultCredentialsProvider(credentialsProvider);
});
};
esSinkBuilder.setRestClientFactory(restClientFactory);
return esSinkBuilder;
}
}
Alarm event entity class
package com.data.dev.common.javabean.kafkaMailTopic;
import com.data.dev.common.javabean.BaseBean;
import lombok.Data;
import java.util.Objects;
/**
* @author wangxiaoming-ghq 2022-05-15
* Alarm event emitted by the detection logic
*/
@Data
public class MailMsgAlarm extends BaseBean {
/**
* The current successful login event
*/
public MailMsg mailMsg;
/**
* The alarm key: username@client_ip
*/
public String alarmKey;
/**
* Event time of the first failed login
*/
public String startTime;
/**
* Event time of the successful login that follows the consecutive failures
*/
public String endTime;
/**
* Number of consecutive failed logins
*/
public int failTimes;
@Override
public String toString() {
return "{" +
" 'mailMsg_login_success':'" + mailMsg + "'" +
", 'alarmKey':'" + alarmKey + "'" +
", 'start_login_time_in3min':'" +startTime + "'" +
", 'end_login_time_in3min':'" +endTime + "'" +
", 'login_fail_times':'" +failTimes + "'" +
"}";
}
public MailMsgAlarm() {
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (!(o instanceof MailMsgAlarm)) return false;
MailMsgAlarm that = (MailMsgAlarm) o;
return getFailTimes() == that.getFailTimes() && getMailMsg().equals(that.getMailMsg()) && getAlarmKey().equals(that.getAlarmKey()) && getStartTime().equals(that.getStartTime()) && getEndTime().equals(that.getEndTime());
}
@Override
public int hashCode() {
return Objects.hash(getMailMsg(), getAlarmKey(), getStartTime(), getEndTime(), getFailTimes());
}
}
Message entity class
package com.data.dev.common.javabean.kafkaMailTopic;
import com.data.dev.common.javabean.BaseBean;
import lombok.Data;
import java.util.Objects;
/**
* {
* "user": "wangxm",
* "client_ip": "110.68.6.182",
* "source": "login",
* "loginname": "wangxm@test.com",
* "IP": "110.8.148.58",
* "timestamp": "17:58:12",
* "@timestamp": "2022-04-20T09:58:13.647Z",
* "ip": "110.7.231.25",
* "clienttype": "POP3",
* "result": "success",
* "@version": "1"
* }
*
* user: login user account
* client_ip: source IP of the login
* source: event type
* loginname: the user's mail address
* ip: target front-end IP
* timestamp: send time
* @timestamp: send date and time
* IP: source IP of the mail log sender
* clienttype: client login type
* result: login result
*/
@Data
public class MailMsg extends BaseBean {
public String user;
public String client_ip;
public String source;
public String loginName;
public String mailSenderSourceIp;
public String timestamp_time;
public String timestamp_datetime;
public String ip;
public String clientType;
public String result;
public String version;
public MailMsg() {
}
public MailMsg(String user, String client_ip, String source, String loginName, String mailSenderSourceIp, String timestamp_time, String timestamp_datetime, String ip, String clientType, String result, String version) {
this.user = user;
this.client_ip = client_ip;
this.source = source;
this.loginName = loginName;
this.mailSenderSourceIp = mailSenderSourceIp;
this.timestamp_time = timestamp_time;
this.timestamp_datetime = timestamp_datetime;
this.ip = ip;
this.clientType = clientType;
this.result = result;
this.version = version;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (!(o instanceof MailMsg)) return false;
MailMsg mailMsg = (MailMsg) o;
return getUser().equals(mailMsg.getUser()) && getClient_ip().equals(mailMsg.getClient_ip()) && getSource().equals(mailMsg.getSource()) && getLoginName().equals(mailMsg.getLoginName()) && getMailSenderSourceIp().equals(mailMsg.getMailSenderSourceIp()) && getTimestamp_time().equals(mailMsg.getTimestamp_time()) && getTimestamp_datetime().equals(mailMsg.getTimestamp_datetime()) && getIp().equals(mailMsg.getIp()) && getClientType().equals(mailMsg.getClientType()) && getResult().equals(mailMsg.getResult()) && getVersion().equals(mailMsg.getVersion());
}
@Override
public int hashCode() {
return Objects.hash(getUser(), getClient_ip(), getSource(), getLoginName(), getMailSenderSourceIp(), getTimestamp_time(), getTimestamp_datetime(), getIp(), getClientType(), getResult(), getVersion());
}
@Override
public String toString() {
return "{" +
" 'user':'" + user + "'" +
", 'client_ip':'" + client_ip + "'" +
", 'source':'" + source + "'" +
", 'loginName':'" + loginName + "'" +
", 'IP':'" + mailSenderSourceIp + "'" +
", 'timestamp':'" + timestamp_time + "'" +
", '@timestamp':'" + timestamp_datetime + "'" +
", 'ip':'" + ip + "'" +
", 'clientType':'" + clientType + "'" +
", 'result':'" + result + "'" +
", 'version':'" + version + "'" +
"}";
}
}
The source code, with sensitive information removed, is available at: https://gitee.com/wangxm-2270/alarmCollectByFlink.git
This concludes this article on sample code for real-time security detection with Flink. For more on real-time security detection with Flink, search earlier articles on 腳本之家 (jb51.net), and we hope you will continue to support 腳本之家!