MongoDB to CSV Spring Boot Batch Example

MongoDB to CSV Spring Boot Batch Example | Previously we have seen Spring Boot Batch examples of CSV to MySQL, CSV to MongoDB, and MySQL to CSV. Now let us see how to export data from a MongoDB collection to a CSV file through Spring Boot Batch. Also see:- Spring Boot Batch API Introduction, MySQL to XML Spring Boot Batch Example

For MongoDB:-

  • MongoPagingItemReader is an implementation of the ItemReader interface that reads documents from MongoDB page by page.
  • MongoPagingItemReader takes the following inputs (a short configuration sketch follows this list):-
    • MongoTemplate
    • Collection name/target type
    • Projection/where condition query (optional)
    • Sorting details (optional)
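
For illustration, here is a minimal sketch of how these inputs are wired on the reader. The filter and projection values are only placeholders assuming the user collection and field names created later in this post; the actual reader bean used in this example (with an empty query) appears in the Batch Config class below.

MongoPagingItemReader<User> reader = new MongoPagingItemReader<>();
reader.setTemplate(mongoTemplate);                      // MongoTemplate
reader.setCollection("user");                           // collection name
reader.setTargetType(User.class);                       // target type
reader.setQuery("{userDept: 'DEV'}");                   // optional where-condition query
reader.setFields("{userName: 1, userRole: 1}");         // optional projection
reader.setSort(Map.of("userId", Sort.Direction.ASC));   // sorting details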

MongoDB to CSV Spring Boot Batch

Create a Spring starter project with the following dependencies:- Lombok, Spring Batch, Spring Data MongoDB, H2 Database, and Spring Web. H2 is needed only for Spring Batch's metadata tables; the application data itself comes from MongoDB.

Different Classes/files:-

  • Model class
  • Batch Config
  • Properties file
  • Controller class

Set up the sample data in MongoDB:-

db.user.insertMany([
    {"userId": 101, "userName": "John", "userRole": "ADMIN", "userDept": "DEV"},
    {"userId": 102, "userName": "Jane", "userRole": "MGR", "userDept": "QA"},
    {"userId": 103, "userName": "Mike", "userRole": "SE", "userDept": "DEV"},
    {"userId": 104, "userName": "Alice", "userRole": "SEQ", "userDept": "QA"},
    {"userId": 105, "userName": "Bob", "userRole": "DER", "userDept": "QA"}
]);

In the application.properties file (spring.batch.job.enabled=false stops the job from running automatically at startup, so it can be triggered from the controller instead):-

spring.data.mongodb.host=localhost
spring.data.mongodb.port=27017
spring.data.mongodb.database=sample
spring.batch.job.enabled=false
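
Alternatively, the same connection details can be supplied as a single connection string (use either the three host/port/database properties above or the URI, not both):

spring.data.mongodb.uri=mongodb://localhost:27017/sample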

Model class:-

package com.knowprogram.demo.model;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

@Data
@AllArgsConstructor
@NoArgsConstructor
public class User {
    private Integer userId;
    private String userName;
    private String userRole;
    private String userDept;
}
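
The property names above match the field names of the documents inserted into the user collection, so Spring Data can map them automatically. If a document field were named differently, the org.springframework.data.mongodb.core.mapping.Field annotation could map it; for example (hypothetical field name):

// Only needed when the Java property name differs from the document field name
@Field("user_name")
private String userName;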

Batch Config class:-

import java.util.HashMap;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.job.builder.JobBuilder;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.data.MongoPagingItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor;
import org.springframework.batch.item.file.transform.DelimitedLineAggregator;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.task.SimpleAsyncTaskExecutor;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.transaction.PlatformTransactionManager;

import com.knowprogram.demo.model.User;

@Configuration
public class BatchConfig {

    @Autowired
    private MongoTemplate mongoTemplate;

    @Bean
    ItemReader<User> reader() {
        MongoPagingItemReader<User> reader = new MongoPagingItemReader<>();
        reader.setTemplate(mongoTemplate);
        reader.setTargetType(User.class);
        reader.setCollection("user");
        // reader.setQuery("{eid: {$lt: 10}}");
        reader.setQuery("{ }");
        reader.setSort(new HashMap<String, Direction>() {
            private static final long serialVersionUID = 1L;
            {
                put("_id", Direction.DESC);
            }
        });
        return reader;
    }

    @Bean
    ItemProcessor<User, User> processor() {
        return item -> item;
    }

    @Bean
    ItemWriter<User> writer() {
        FlatFileItemWriter<User> writer = new FlatFileItemWriter<>();
        // Output location; adjust the path to a folder that exists on your system
        writer.setResource(new FileSystemResource("V:/myouts/usersmongodb.csv"));
        writer.setLineAggregator(new DelimitedLineAggregator<>() {
            {
                setDelimiter(",");
                setFieldExtractor(new BeanWrapperFieldExtractor<>() {
                    {
                        setNames(new String[] { "userId", "userName", 
                                        "userRole", "userDept" }
                        );
                    }
                });
            }
        });
        return writer;
    }
    

    @Bean
    Step step(JobRepository repository, PlatformTransactionManager transactionManager) {
        return new StepBuilder("csv-step", repository)
                .<Product, Product>chunk(10, transactionManager)
                .reader(reader())
                .processor(processor()).writer(writer())
                .taskExecutor(new SimpleAsyncTaskExecutor() {
                    private static final long serialVersionUID = 1L;
                    {
                        setConcurrencyLimit(10);
                    }
                })
                .build();
    }

    @Bean
    JobExecutionListener listener() {
        return new JobExecutionListener() {
            @Override
            public void beforeJob(JobExecution jobExecution) {
                System.out.println("MyJobListener.beforeJob()");
            }

            @Override
            public void afterJob(JobExecution jobExecution) {
                System.out.println("MyJobListener.afterJob()");
            }
        };
    }

    @Bean(name = "csvJob")
    Job job(JobRepository jobRepository, PlatformTransactionManager transactionManager) {
        return new JobBuilder("csv-job", jobRepository)
                .listener(listener())
                .flow(step(jobRepository, transactionManager))
                .end()
                .build();
    }
}
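
Optionally, FlatFileItemWriter can also write a header row before the data. A one-line addition inside the writer() bean above is enough (shown here as a sketch):

// Writes the column names once, before the first chunk is written
writer.setHeaderCallback(w -> w.write("userId,userName,userRole,userDept"));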

Controller class:-

package com.knowprogram.demo.controller;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.JobParametersInvalidException;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.repository.JobExecutionAlreadyRunningException;
import org.springframework.batch.core.repository.JobInstanceAlreadyCompleteException;
import org.springframework.batch.core.repository.JobRestartException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class UserBatchController {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    @GetMapping("/startBatch")
    public String startBatch() throws JobExecutionAlreadyRunningException, 
        JobRestartException,
        JobInstanceAlreadyCompleteException, 
        JobParametersInvalidException {
        JobParameters params = new JobParametersBuilder()
             .addLong("time", System.currentTimeMillis())
             .toJobParameters();

        JobExecution run = jobLauncher.run(job, params);
        return run.getStatus().toString();
    }
}

After running the application and calling GET /startBatch, the job reads all documents from the user collection and writes them to the usersmongodb.csv file.
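
Since the reader sorts by _id in descending order (the most recently inserted document first), the generated file should contain something close to the following rows; the exact order may vary slightly because the step runs with a multi-threaded task executor:

105,Bob,DER,QA
104,Alice,SEQ,QA
103,Mike,SE,DEV
102,Jane,MGR,QA
101,John,ADMIN,DEV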

If you enjoyed this post, share it with your friends. Do you want to share more information about the topic discussed above or do you find anything incorrect? Let us know in the comments. Thank you!
