kafka-inlet create


ockam kafka-inlet create [OPTIONS]

Create a new Kafka Inlet. Kafka clients v3.7.0 and earlier are supported. You can check which version you have with 'kafka-topics.sh --version'

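A minimal sketch of creating an inlet, assuming a node is already running and the Kafka Outlet is reachable through an Ockam project named default (the project name is an assumption; see the options and examples below):

ockam kafka-inlet create --from 4000 --to /project/default

Passing only a port to --from binds the inlet on the default loopback address, i.e. 127.0.0.1:4000.
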
Options

  • --at [NODE_NAME] (optional)
    Perform the command on the given node

  • --addr [ADDR] (optional)
    The local address of the service

  • --from [FROM] (optional)
    The address where to bind and where the client will connect, along with its port, in the form <address>:<port>. If only a port is specified, the default loopback address is used (e.g. 127.0.0.1:4000)

  • --brokers-port-range [BROKERS_PORT_RANGE] (optional)
    Local port range dynamically allocated to Kafka brokers; it must not overlap with the bootstrap port

  • --to [ROUTE] (optional)
    The route to the Kafka Outlet node, either the project in Ockam Orchestrator or a Rust node, expected to be something like /project/<name>. Use self when the Kafka Outlet is local (see the examples below)

  • --consumer [ROUTE] (optional)
    The direct route to a single Kafka consumer node instead of using a relay for its resolution. A single encryption key will be exchanged with the provided consumer

  • --consumer-relay [ROUTE] (optional)
    The route to the Kafka consumer relay node. Encryption keys will be exchanged passing through this relay, based on topic and partition name. By default, this parameter uses the value of --to

  • --publishing-relay [ROUTE] (optional)
    The route to the Kafka consumer relay node which will be used to make this consumer available to producers. By default, this parameter uses the value of --consumer-relay

  • --avoid-publishing (optional)
    Avoid publishing the consumer in the relay. This is useful to avoid the creation of an unused relay when the consumer is directly referenced by the producer

  • --disable-content-encryption (optional)
    Disable end-to-end encryption of Kafka messages between producer and consumer. Use it when you want a plain Kafka portal; the communication itself will still be encrypted

  • --encrypted-field [FIELD] (optional)
    The fields to encrypt in the Kafka messages, assuming the record is a valid JSON map. By default, the whole record is encrypted

  • --allow [INLET-EXPRESSION] (optional)
    Policy expression that will be used for access control to the Kafka Inlet. If you don't provide it, the policy set for the "tcp-inlet" resource type will be used

  • --allow-consumer [CONSUMER-EXPRESSION] (optional)
    Policy expression that will be used for access control to the Kafka Consumer. If you don't provide it, the policy set for the "kafka-consumer" resource type will be used

  • --allow-producer [PRODUCER-EXPRESSION] (optional)
    Policy expression that will be used for access control to the Kafka Producer. If you don't provide it, the policy set for the "kafka-producer" resource type will be used
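
Examples

All values below (node names, addresses, ports, project names, routes, attribute names, field names, and the start-end port-range format) are illustrative assumptions; substitute your own.

Bind the inlet on an explicit address, reserve a local port range for the dynamically allocated brokers, and route to a Kafka Outlet exposed through an Ockam project:

ockam kafka-inlet create --from 127.0.0.1:9092 --brokers-port-range 9093-9098 --to /project/default

If the Kafka Outlet runs locally, route to it with self:

ockam kafka-inlet create --from 127.0.0.1:9092 --to self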
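
A single consumer can be referenced directly instead of resolved through a relay: the inlet used by the producer passes the consumer's route with --consumer (a hypothetical Ockam multiaddr below), and the inlet used by the consumer adds --avoid-publishing so no unused relay is created:

ockam kafka-inlet create --at producer-node --from 127.0.0.1:9092 --to /project/default --consumer /dnsaddr/consumer.example.com/tcp/4000/secure/api

ockam kafka-inlet create --at consumer-node --from 127.0.0.1:9092 --to /project/default --avoid-publishing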
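
Run a plain Kafka portal without end-to-end message encryption (the Ockam connection itself stays encrypted), or encrypt only a chosen field of JSON records:

ockam kafka-inlet create --from 127.0.0.1:9092 --to /project/default --disable-content-encryption

ockam kafka-inlet create --from 127.0.0.1:9092 --to /project/default --encrypted-field credit_card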
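
Restrict access with policy expressions; the role attribute and its values are assumptions, not required names:

ockam kafka-inlet create --from 127.0.0.1:9092 --to /project/default --allow '(= subject.role "kafka-client")' --allow-consumer '(= subject.role "consumer")' --allow-producer '(= subject.role "producer")'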