flow record flow-record1
match ipv4 source address
match ipv4 destination address
match transport source-port
match transport destination-port
collect counter bytes
collect counter packets
!
flow exporter flow-exporter1
destination 10.0.0.100
transport udp 2055
!
flow monitor flow-monitor1
exporter flow-exporter1
record flow-record1
!
interface GigabitEthernet1
ip flow monitor flow-monitor1 input
!
end
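Before looking at the collector side, it is worth confirming on the router that flows are actually being cached and exported. Assuming the monitor and exporter names used above, the standard Flexible NetFlow show commands are:

show flow monitor flow-monitor1 cache
show flow exporter flow-exporter1 statistics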
# goflow2 --help
Usage of goflow2:
-format string
Choose the format (available: json, pb, text)(default "json")
-format.hash string
List of fields to do hashing, separated by commas (default "SamplerAddress")
-format.protobuf.fixedlen
Prefix the protobuf with message length
-format.selector string
List of fields to do keep in output
-format.tag string
Use format tag
-listen string
listen addresses (default "sflow://:6343,netflow://:2055")
-logfmt string
Log formatter (default "normal")
-loglevel string
Log level (default "info")
-mapping string
Configuration file for custom mappings
-metrics.addr string
Metrics address (default ":8080")
-metrics.path string
Metrics path (default "/metrics")
-netflow.templates string
Choose the format (available: memory, file)(default "memory")
-netflow.templates.file.path string
Path of file to store templates (default "./templates.json")
-reuseport
Enable so_reuseport
-templates.path string
NetFlow/IPFIX templates list (default "/templates")
-transport string
Choose the transport (available: file, kafka)(default "file")
-transport.file string
File/console output (empty for stdout)
-transport.file.sep string
Line separator (default "\n")
-transport.kafka.brokers string
Kafka brokers list separated by commas (default "127.0.0.1:9092,[::1]:9092")
-transport.kafka.compression string
Kafka default compression
-transport.kafka.flushbytes int
Kafka flush bytes (default 104857600)
-transport.kafka.flushfreq duration
Kafka flush frequency (default 5s)
-transport.kafka.hashing
Enable partition hashing
-transport.kafka.log.err
Log Kafka errors
-transport.kafka.maxmsgbytes int
Kafka max message bytes (default 1000000)
-transport.kafka.sasl string
Use SASL to connect to Kafka, available settings: none, plain, scram-sha256, scram-sha512 (TLS is recommended and the environment variables KAFKA_SASL_USER and KAFKA_SASL_PASS need to be set)(default "none")
-transport.kafka.srv string
SRV record containing a list of Kafka brokers (or use brokers)
-transport.kafka.tls
Use TLS to connect to Kafka
-transport.kafka.topic string
Kafka topic to produce to (default "flow-messages")
-transport.kafka.version string
Kafka version (default "2.8.0")
-v Print version
-workers int
Number of workers per collector (default 1)