Integrating SAP Commerce with Apache Kafka brings real-time data flows to e-commerce functionality. The real questions are: in what circumstances do these two systems need to be integrated, and if so, how can we achieve it?
I assume most readers of this page already know SAP Commerce and Kafka. But even if you landed here while roaming the web, this section is for you.
Apache Kafka is an open-source distributed event store and stream-processing platform. That sounds pretty good, but what does it mean for an e-commerce platform?
Let’s dive deeper and consider a headless e-commerce application. Upon checkout, the platform wants to do the following:
The integrations listed above can become complicated as they grow.
I have worked on many projects with a similar pattern, and we built these integrations, but it takes a lot of patience and a proper architecture to get them right.
Here we are examining a scenario where each service is managed by a distinct team, which adds complexity because:
Normal Scenario
Now let’s consider Kafka in this situation:
Here, the service responsible for the checkout emits an event (message) about the checkout via Kafka; after that, it can forget about who is listening and what happens downstream. The other services listen to Kafka, read these events, and use them for their own purposes.
The advantages are:
With Kafka
Kafka has a lot of use cases, but my topic here is a little different from explaining Kafka itself.
SAP Commerce is a trusted e-commerce platform that can help you innovate at scale and tap enterprise-wide data to boost profits and customer satisfaction. I don’t think I need to explain it in as much depth as Kafka here.
I initially wondered why these two would need to meet, but once I (as an SAP CC expert) understood Kafka, it made sense. Apart from the scenarios already mentioned, there are a few more in which this integration is needed, for example:
Some knowledge of Kafka is mandatory from here on. If you don’t have it yet, let me explain a few of the main terms used in this document.
Producers
Producers are the components that produce (create) data and push it to Kafka for consumers to consume (read).
Consumers
Consumers (I think I spoiled it already) are the components that take data from Kafka and do whatever they are supposed to do with it.
Topics
A topic can be thought of as an ordered list of events (data). This matters because, in everyday use, a Kafka cluster deals with many topics and many producers and consumers. A producer pushes its events (data) to a topic, and any consumer subscribed to that topic reads those events automatically as they arrive. This ensures that the other consumers and producers sharing the same Kafka cluster don’t interfere with each other.
Consumer Groups
Consumer groups are simply groups of consumers. Kafka distributes a topic’s partitions across the consumers in a group, which enables parallel processing of event messages, as the sketch below illustrates.
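As a quick illustration in Spring Kafka terms (the listener annotation is introduced properly later in this post): listeners sharing a group id split a topic’s partitions between them, while a listener with a different group id receives every event independently. The class, topic, and group names below are illustrative placeholders:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Illustrative sketch; topic and group names are placeholders.
@Component
public class ConsumerGroupExample {

    // These two listeners share a group id, so Kafka splits the topic's
    // partitions between them: each event is processed once by the group,
    // in parallel across the two consumers.
    @KafkaListener(topics = "checkout-events", groupId = "inventory-service")
    public void updateStockA(final String event) {
        // handles a share of the events
    }

    @KafkaListener(topics = "checkout-events", groupId = "inventory-service")
    public void updateStockB(final String event) {
        // handles the remaining share
    }

    // A different group id gets its own copy of every event.
    @KafkaListener(topics = "checkout-events", groupId = "email-service")
    public void sendConfirmation(final String event) {
        // sees every event on the topic
    }
}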
Apart from all this, you will also need a running Kafka setup.
Refer to the official quickstart, or to the plentiful material on YouTube that walks you through the setup.
Let’s now discuss the integration.
There are two ways we can integrate SAP CC with Kafka:
I will explain the Spring-based approach here.
Setup
1. Add dependencies
SAP CC ships with Spring, but that doesn’t include the Spring Kafka packages, so we have to add them as Maven dependencies in your extension’s external-dependencies.xml.
Also, make sure usemaven="true" is set in your extensioninfo.xml.
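For reference, here is a minimal sketch of that flag in extensioninfo.xml, assuming a dedicated extension with a web module; the extension name and webroot are illustrative:

<extensioninfo xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <!-- "mykafkaextension" and the webroot are illustrative placeholders -->
    <extension name="mykafkaextension" usemaven="true">
        <webmodule webroot="/mykafkaextension"/>
    </extension>
</extensioninfo>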
Artifact: spring-kafka

<!-- https://mvnrepository.com/artifact/org.springframework.kafka/spring-kafka -->
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>x.y.z</version>
</dependency>
One thing to keep in mind: make sure the Spring Kafka version is compatible with the Spring version used by your SAP CC release.
2. Make sure your producer and consumer packages are scanned by your web module.
<!-- Make sure your packages are scanned by the web module's Spring config -->
<context:component-scan base-package="your.package.com"/>
It is good practice to load the Kafka beans through the web module; otherwise, the Kafka converters will clash with the out-of-the-box (OOTB) SAP CC converters and a lot of compatibility issues will arise.
SAP CC as a producer
3. Create a producer configuration
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {

    // Kafka broker address, e.g. localhost:9092; the property key is illustrative
    @Value("${kafka.bootstrap.address}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        final Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
4. Create a producer
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

// The wrapping class name and property key are illustrative
@Component
public class OrderMessageProducer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Value("${config.topic.name}")
    private String topicName;

    public void sendMessage(final String msg) {
        kafkaTemplate.send(topicName, msg);
    }
}
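To connect this back to the checkout scenario, here is a minimal sketch of how the producer could be triggered from SAP CC. It assumes the standard SubmitOrderEvent and AbstractEventListener from the SAP CC service layer; the listener class name and its wiring to the OrderMessageProducer above are illustrative:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import de.hybris.platform.commerceservices.event.SubmitOrderEvent;
import de.hybris.platform.servicelayer.event.impl.AbstractEventListener;

// Illustrative listener: on checkout, emit the order code to Kafka and
// forget about it; downstream services read it from the topic.
@Component
public class CheckoutKafkaEventListener extends AbstractEventListener<SubmitOrderEvent> {

    @Autowired
    private OrderMessageProducer orderMessageProducer;

    @Override
    protected void onEvent(final SubmitOrderEvent event) {
        orderMessageProducer.sendMessage(event.getOrder().getCode());
    }
}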
SAP CC as a consumer
5. Create a consumer configuration
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    // Property keys are illustrative; align them with your project properties
    @Value("${kafka.bootstrap.address}")
    private String bootstrapAddress;

    @Value("${config.group.id}")
    private String groupId;

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        final Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // Disable auto-commit because the listener below acknowledges manually
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        final ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        // MANUAL ack mode is needed for the Acknowledgment parameter used in step 6
        factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL);
        return factory;
    }
}
6. Create a consumer
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.messaging.handler.annotation.Payload;
import org.springframework.stereotype.Component;

// The wrapping class name is illustrative
@Component
public class CheckoutEventConsumer {

    @KafkaListener(topics = "${config.topic.name}", groupId = "${config.group.id}")
    public void consume(@Payload final String payload, final Acknowledgment acknowledgment) {
        // process the payload
        acknowledgment.acknowledge();
    }
}
Here we acknowledge back to Kafka that the event (data) has been successfully consumed, but only after processing it. It doesn’t have to be this way; feel free to adapt it to your use case.
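The string-based setup above keeps the example simple, but a real checkout event usually carries structured data. As a sketch, Spring Kafka’s built-in JSON serializer can replace the string one; OrderEvent is an illustrative POJO of your own, and the bean below would sit inside the KafkaProducerConfig from step 3 (with one extra import):

import org.springframework.kafka.support.serializer.JsonSerializer;

// Same producer factory as in step 3, but sending a POJO as JSON.
// OrderEvent is an illustrative class; define it in your extension.
@Bean
public ProducerFactory<String, OrderEvent> orderEventProducerFactory() {
    final Map<String, Object> configProps = new HashMap<>();
    configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
    configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return new DefaultKafkaProducerFactory<>(configProps);
}

A matching KafkaTemplate<String, OrderEvent> bean, and Spring Kafka’s JsonDeserializer on the consumer side, complete the round trip.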
Additional Info
If you are running on SAP Commerce Cloud, a good starting point is to keep your consumer active in only one aspect. One approach to achieving that is to make sure your web module runs on a single node of the cloud, which you can configure through manifest.json.
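As a sketch, assuming the Kafka extension’s web module should run only on the backgroundProcessing aspect (the extension name and context path are illustrative):

{
  "aspects": [
    {
      "name": "backgroundProcessing",
      "webapps": [
        {
          "name": "mykafkaextension",
          "contextPath": "/kafka"
        }
      ]
    }
  ]
}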
References
Apache Kafka
SAP Commerce Cloud with Kafka
Feel free to reach out for further details on this topic; I’m happy to provide additional information and insights on anything discussed in this blog.