Send Sangfor NGAF Logs to Elasticsearch

Newbie405830 Lv1 Posted 17 Jan 2024 10:46

Hello,

Has anyone ever sent logs from Sangfor NGAF to Elasticsearch or the ELK Stack? How do I send Sangfor NGAF logs to ELK?

Please share your experience, or suggest the best open-source log server option for syslog other than Kiwi Syslog.

Thanks


Rica Cortez Lv2 Posted 29 Jan 2024 11:30

You can add Elasticsearch as a syslog destination. Just go to the Syslog tab and enter the necessary information when adding the syslog server. You can also contact the support team.
Happpy Lv3 Posted 29 Jan 2024 11:28

For Security, Application Control, Traffic Audit, NAT, User Authentication, SSL VPN, Local ACL, and HA Error logs, choose Syslog as the Logging Location.
Then enter the IP address and port of the syslog server (for example, 10.10.10.10 with the standard syslog port 514).
Pat Lv4 Posted 29 Jan 2024 11:27
  
Sending Sangfor NGAF logs to Elasticsearch or the ELK Stack is definitely doable, with several options available:

Using the Kiwi Syslog Plugin:

Official Plugin: Sangfor offers the official "Kiwi Syslog Plugin" for NGAF, specifically designed to integrate with the ELK Stack. It forwards logs formatted for Graylog2 (Sangfor's internal syslog server) to Elasticsearch.
Configuration: Configure the plugin within NGAF by specifying the Kiwi Syslog server address and port. On the ELK side, install the Graylog2 input plugin and map fields appropriately.
Directly via Logstash:

Logstash Configuration: Set up a Logstash instance on a separate server. Configure a File input to read the NGAF log files (usually under /opt/Sangfor/logs). Apply Grok filters to parse the logs and extract the desired fields. Finally, output the transformed data to Elasticsearch using the Elasticsearch output plugin (see the sketch after these bullets).
Flexibility: This approach offers greater flexibility than the Kiwi plugin. You can customize parsing rules, enrich logs with additional data sources, and even route different log types to different Elasticsearch indices.
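To make the file-based approach concrete, here is a minimal Logstash pipeline sketch. The log path, Grok pattern, field names, and index name are assumptions for illustration only; adjust them to match your actual NGAF log format and environment.

    # /etc/logstash/conf.d/ngaf.conf -- illustrative sketch, not an official Sangfor config
    input {
      file {
        path => "/opt/Sangfor/logs/*.log"          # assumed location, verify on your appliance
        start_position => "beginning"
        sincedb_path => "/var/lib/logstash/sincedb_ngaf"
      }
    }

    filter {
      grok {
        # Hypothetical pattern -- replace with one that matches your real NGAF log lines
        match => { "message" => "%{SYSLOGTIMESTAMP:log_time} %{IPORHOST:device} %{GREEDYDATA:event}" }
      }
      date {
        match => [ "log_time", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
    }

    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "ngaf-%{+YYYY.MM.dd}"
      }
    }

Run Logstash with this pipeline and check Kibana (or the _cat/indices API) for the ngaf-* index to confirm documents are flowing.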
Alternative Open-Source Syslog Servers:

Fluentd: Similar to Logstash, Fluentd can act as a log collector and forwarder. Configure a Fluentd agent to read NGAF logs, parse them with plugins, and send them to Elasticsearch.
rsyslog: While mainly a syslog daemon, rsyslog offers modules such as omelasticsearch for forwarding logs to Elasticsearch. Configure rsyslog on your ELK server to collect NGAF logs via UDP or TCP with proper parsing rules (a minimal sketch follows).
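As a rough illustration of the rsyslog route, the snippet below listens for NGAF syslog on UDP 514 and bulk-indexes the messages via the omelasticsearch module. The template, index name, and ports are assumptions; treat it as a starting point rather than a verified configuration.

    # /etc/rsyslog.d/50-ngaf-es.conf -- sketch only
    module(load="imudp")                      # receive syslog over UDP
    input(type="imudp" port="514")

    module(load="omelasticsearch")            # Elasticsearch output module

    # Minimal JSON template for the indexed documents
    template(name="ngaf-json" type="list") {
      constant(value="{\"@timestamp\":\"")    property(name="timereported" dateFormat="rfc3339")
      constant(value="\",\"host\":\"")        property(name="hostname")
      constant(value="\",\"severity\":\"")    property(name="syslogseverity-text")
      constant(value="\",\"message\":\"")     property(name="msg" format="json")
      constant(value="\"}")
    }

    # Index everything received (add a filter on the NGAF source IP if needed)
    action(type="omelasticsearch"
           server="localhost"
           serverport="9200"
           template="ngaf-json"
           searchIndex="ngaf-logs"
           bulkmode="on")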
Rizmae Lv2 Posted 29 Jan 2024 11:26

Set up the syslog server entry, include the Elasticsearch collector's address and the other required details, and there you go.
Donsadam Posted 29 Jan 2024 11:24
  
Simply add the ELK Stack or Elasticsearch server's IP address on the "Logging and Archiving" page.
RegiBoy Lv5 Posted 29 Jan 2024 11:24

Sending Sangfor NGAF logs to Elasticsearch or the ELK Stack (Elasticsearch, Logstash, and Kibana) is standard practice for centralized log management. You can accomplish this by using Logstash to collect, filter, and forward the Sangfor NGAF syslog data to Elasticsearch: configure Logstash with an input plugin to receive the syslog messages, filters to parse and structure the data, and an output plugin to send the processed logs on to Elasticsearch. Make sure that Elasticsearch's mappings are configured appropriately and that the required ports are open to receive the NGAF log data. As open-source alternatives to Kiwi Syslog, you may want to look at tools like Fluentd, Graylog, or syslog-ng, depending on your needs and preferences.
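A bare-bones version of that syslog-based pipeline could look like the sketch below. The listening port and index name are arbitrary choices for illustration, and in practice you would add Grok or key-value filters for the NGAF-specific fields.

    # Sketch: receive NGAF syslog and ship it to Elasticsearch
    input {
      syslog {
        port => 5140                 # unprivileged port; point the NGAF (or a relay) at it
      }
    }

    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "sangfor-ngaf-%{+YYYY.MM.dd}"
      }
    }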
mdamores Lv3 Posted 25 Jan 2024 10:25

You can follow these steps to send logs from Sangfor NGAF to the ELK Stack:

1. Enable logging on Sangfor NGAF
   - Log in to the NGAF console.
   - Configure NGAF to send logs to a remote syslog server.
2. Set up a syslog server
   - Deploy a syslog server that can receive logs from Sangfor NGAF. This can be the same server where the ELK Stack is installed, or a separate syslog server.
3. Configure Logstash
   - Logstash is a log processing pipeline that can ingest logs from various sources, including syslog.
   - Configure Logstash to receive logs from the syslog server and process them; create Logstash input configurations to listen for incoming syslog messages.
4. Install Elasticsearch
   - Install and configure Elasticsearch, which will store and index the logs.
5. Set up Kibana
   - Install and configure Kibana for visualizing and analyzing the logs stored in Elasticsearch.
6. Send logs from Logstash to Elasticsearch
   - Configure Logstash to send the processed logs to Elasticsearch.
7. Testing and monitoring
   - Verify that logs from Sangfor NGAF are reaching Elasticsearch (see the quick checks after this list).
8. Troubleshooting
   - Monitor the logs and troubleshoot any issues you encounter during the integration.
   - If all else fails, try reaching out to Sangfor Support for assistance.
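For step 7, a couple of quick checks can confirm the pipeline end to end. The host, port, and index pattern below are assumptions based on the examples earlier in this thread; substitute your own values.

    # Send a test syslog message to the collector (util-linux logger; host placeholder is yours to fill in)
    logger -n <logstash-or-syslog-host> -P 5140 -d "NGAF integration test message"

    # Confirm that documents are landing in Elasticsearch
    curl -s "http://localhost:9200/_cat/indices/ngaf-*?v"
    curl -s "http://localhost:9200/ngaf-*/_search?size=1&sort=@timestamp:desc&pretty"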
rivsy Lv5 Posted 24 Jan 2024 09:26

Just add the IP address of the ELK Stack or Elasticsearch server on the "Logging and Archiving" tab.
Tayyab0101 Lv2 Posted 23 Jan 2024 19:37

Configure the syslog server and add Elasticsearch as the destination.
Enrico Vanzetto Lv3 Posted 23 Jan 2024 18:42

Hi, you can achieve this by collecting the syslog output from the NGAF.
