
Spring Boot Performance

Logging

  • hibernate.show_sql (not in prod!)
  • hibernate.format_sql
  • hibernate.use_sql_comments

Bind parameters: set the logger org.hibernate.type.descriptor.sql to TRACE.
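
A minimal application.properties sketch for the settings above (assuming the Hibernate 5 logger name; not for production):

spring.jpa.properties.hibernate.show_sql=true
spring.jpa.properties.hibernate.format_sql=true
spring.jpa.properties.hibernate.use_sql_comments=true
# log bind parameter values
logging.level.org.hibernate.type.descriptor.sql=trace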

P6Spy

pom.xml

<dependency>
    <groupId>p6spy</groupId>
    <artifactId>p6spy</artifactId>
    <version>3.9.1</version> <!-- You can use the latest version -->
</dependency>

application.properties

spring.datasource.driver-class-name=com.p6spy.engine.spy.P6SpyDriver
spring.datasource.url=jdbc:p6spy:mysql://localhost:3306/your_database
spring.datasource.username=your_username
spring.datasource.password=your_password

spy.properties

databaseDialectDateFormat=yyyy-MM-dd'T'HH:mm:ss.SSSZ
customLogMessageFormat=%{currentTime}|%(executionTime)ms|%(category)|connection %(connectionId)|\n%(sqlSingleLine)

modulelist=com.p6spy.engine.outage.P6OutageFactory,com.p6spy.engine.logging.P6LogFactory
logfile=spy.log
logMessageFormat=com.p6spy.engine.spy.appender.SingleLineFormat
appender=com.p6spy.engine.spy.appender.Slf4JLogger
databaseDialectDateFormat=yyyy-MM-dd HH:mm:ss.SSS
  • modulelist: specifies which P6Spy modules to enable.
  • logfile: defines where the log file will be created.
  • appender: determines where logs should be sent, such as a file or SLF4J for logging through a standard logging framework.

Datasource-proxy

@Bean
public DataSource dataSource() {
    SLF4JQueryLoggingListener loggingListener = new SLF4JQueryLoggingListener();
    loggingListener.setQueryLogEntryCreator(new InlineQueryLogEntryCreator());

    return ProxyDataSourceBuilder.create(actualDataSource())
        .name(DATA_SOURCE_PROXY_NAME)
        .listener(loggingListener)
        .build();
}
  • you can plug in your own custom statement execution listeners (see the sketch below)
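
A sketch of such a custom listener, assuming datasource-proxy's QueryExecutionListener API (the class name and threshold are made up for illustration); register it with .listener(new SlowQueryListener()) on the builder above:

import java.util.List;

import net.ttddyy.dsproxy.ExecutionInfo;
import net.ttddyy.dsproxy.QueryInfo;
import net.ttddyy.dsproxy.listener.QueryExecutionListener;

public class SlowQueryListener implements QueryExecutionListener {

    private static final long THRESHOLD_MS = 100; // hypothetical threshold

    @Override
    public void beforeQuery(ExecutionInfo execInfo, List<QueryInfo> queryInfoList) {
        // no-op: only the elapsed time after execution is interesting here
    }

    @Override
    public void afterQuery(ExecutionInfo execInfo, List<QueryInfo> queryInfoList) {
        if (execInfo.getElapsedTime() > THRESHOLD_MS) {
            queryInfoList.forEach(queryInfo ->
                System.err.println("Slow query (" + execInfo.getElapsedTime() + " ms): "
                    + queryInfo.getQuery()));
        }
    }
}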

FetchSize

statement.setFetchSize(fetchSize) — default fetch size: 10 for Oracle, 120 for SQL Server; PostgreSQL and MySQL fetch the whole result set in a single roundtrip.

With JPA 2.2 getResultStream() you still need to set the fetch size for PostgreSQL and MySQL.

Streams

MySQL streaming

One Record

statement.setFetchSize(Integer.MIN_VALUE)

Multiple Records

statement.setFetchSize(fetchSize)
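
A plain-JDBC sketch of the row-by-row variant shown under "One Record" above (table, column names and the process(...) call are placeholders):

try (PreparedStatement statement = connection.prepareStatement(
        "SELECT id, title FROM post ORDER BY id")) {
    // Integer.MIN_VALUE switches MySQL Connector/J to row-by-row streaming;
    // a positive fetch size streams in chunks instead
    statement.setFetchSize(Integer.MIN_VALUE);
    try (ResultSet resultSet = statement.executeQuery()) {
        while (resultSet.next()) {
            // each row is processed without materializing the whole result set
            process(resultSet.getLong("id"), resultSet.getString("title"));
        }
    }
}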

Oracle

spring.jpa.properties.hibernate.jdbc.fetch_size=50

MySQL / PostgreSQL

@Query("""
    select p
    from Post p
    where date(p.createdOn) >= :sinceDate
""")
@QueryHints(
    @QueryHint(name = AvailableHints.HINT_FETCH_SIZE, value = "25")
)
Stream<Post> streamByCreatedOnSince(
    @Param("sinceDate") LocalDate sinceDate
);
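
Consuming such a stream (a sketch): the stream must stay inside an open, read-only transaction and must be closed, hence try-with-resources:

@Transactional(readOnly = true)
public void processPostsSince(LocalDate sinceDate) {
    try (Stream<Post> posts = postRepository.streamByCreatedOnSince(sinceDate)) {
        posts.forEach(post -> {
            // handle one Post at a time without loading the whole result set into memory
        });
    }
}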

Pagination

Data grows per page

  • FETCH FIRST N ROWS ONLY
  • FETCH NEXT N ROWS ONLY
  • OFFSET M ROWS — supported since Oracle 12c, SQL Server 2012, PostgreSQL 8.4

Note: always add an ORDER BY, otherwise the page contents are not deterministic.

Top-N

SELECT title
FROM post
ORDER BY created_on DESC, id DESC
FETCH FIRST 5 ROWS ONLY

Next-N

SELECT title
FROM post
ORDER BY created_on DESC, id DESC
OFFSET 5 ROWS
FETCH NEXT 5 ROWS ONLY

PostgreSQL / MySQL Top-N

SELECT title
FROM post
ORDER BY created_on DESC, id DESC
LIMIT 5

PostgreSQL / MySQL Next-N

SELECT title
FROM post
ORDER BY created_on DESC, id DESC
LIMIT 5
OFFSET 5

NB: OFFSET comes AFTER LIMIT!

JPQL Querying - Pagination

Page<Post> findAllByTitle(
    @Param("titlePattern") String titlePattern,
    Pageable pageRequest
);

@Query("""
    select p
    from Post p
    where p.title like :titlePattern
""")
Page<Post> findAllByTitle(
    @Param("titlePattern") String titlePattern,
    Pageable pageRequest
);

JPQL Query Pagination - Top-N

Page<Post> posts = postRepository.findAllByTitle(
    "High-Performance Java Persistence %",
    PageRequest.of(0, 25, Sort.by("createdOn"))
);
SELECT p.id, p.created_on, p.title
FROM post p
WHERE p.title LIKE 'High-Performance Java Persistence %' ESCAPE ''
ORDER BY p.created_on ASC
OFFSET 0 ROWS
FETCH FIRST 25 ROWS ONLY
@Query(value = """
    SELECT p.id, p.title, p.created_on
    FROM post p
    WHERE p.title ilike :titlePattern
    ORDER BY p.created_on
""",
nativeQuery = true)
Page<Post> findAllByTitleLike(
    @Param("titlePattern") String titlePattern,
    Pageable pageRequest
);

Top-N

Page<Post> posts = postRepository.findAllByTitle(
    "High-Performance Java Persistence %",
    PageRequest.of(0, 25)
);
SELECT p.id, p.title, p.created_on
FROM post p
WHERE p.title ilike 'High-Performance Java Persistence %'
ORDER BY p.created_on
FETCH FIRST 25 ROWS ONLY

Offset pagination index scanning performance

CREATE INDEX idx_post_created_on ON post (created_on DESC, id DESC);

SELECT id
FROM post
ORDER BY created_on DESC
LIMIT 50;
Limit  (cost=0.28..2.51 rows=50 width=16)
  (actual time=0.013..0.022 rows=50 loops=1)
  -> Index Scan using idx_post_created_on on post p
     (cost=0.28..223.28 rows=5000 width=16)
     (actual time=0.013..0.019 rows=50 loops=1)
Planning time: 0.113 ms
Execution time: 0.055 ms

2nd and later pages

SELECT id
FROM post
ORDER BY created_on DESC
LIMIT 50
OFFSET 50;
Limit  (cost=2.51..4.74 rows=50 width=16)
  (actual time=0.032..0.044 rows=50 loops=1)
  -> Index Scan using idx_post_created_on on post p
     (cost=0.28..223.28 rows=5000 width=16)
     (actual time=0.022..0.040 rows=100 loops=1)
Planning time: 0.198 ms
Execution time: 0.071 ms

Now 100 rows are scanned! On the last page everything is scanned (≈1.190 ms): OFFSET doesn't seek/traverse, it reads and throws away the skipped rows.

Seek or keyset pagination

SELECT id, created_on
FROM post
ORDER BY created_on DESC, id DESC
LIMIT 50

created_on AND id must BOTH be in the ORDER BY (the sort key has to be unique for the seek to be deterministic).

Next-N

SELECT id, created_on
FROM post
WHERE (created_on, id) < ('2024-10-02 21:00:00.0', 4951)
ORDER BY created_on DESC, id DESC
LIMIT 50;
  • The row value expression (a, b) < (c, d) is supported by PostgreSQL and MySQL.
  • It is equivalent to a < c OR (a = c AND b < d).

Now the scan touches at most 50 rows per page.

Spring Data JPA - WindowIterator

Solution!

WindowIterator<PostComment> commentWindowIterator = WindowIterator.of(
    position -> postCommentRepository.findByPost(
        post,
        PageRequest.of(
            0, pageSize,
            Sort.by(
                Sort.Order.desc(PostComment_.CREATED_ON),
                Sort.Order.desc(PostComment_.ID)
            )
        ),
        position
    )
).startingAt(ScrollPosition.keyset());

commentWindowIterator.forEachRemaining(this::processPostComment);
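
The repository method the iterator calls above would look roughly like this (a sketch, assuming Spring Data JPA's scrolling API with org.springframework.data.domain.Window and ScrollPosition):

Window<PostComment> findByPost(
    Post post,
    PageRequest pageRequest,
    ScrollPosition position
);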

With Blaze Persistence - Top-N

// Blaze Persistence – Keyset pagination
PagedList<Post> firstPage = cbf
    .create(entityManager, Post.class)
    .orderByAsc(Post_.CREATED_ON).orderByAsc(Post_.ID)
    .page(0, pageSize)
    .withKeysetExtraction(true)
    .getResultList();
SELECT p.id, p.created_on, p.title,
       (SELECT count(*) FROM post)
FROM post p
ORDER BY p.created_on, p.id
OFFSET 0 ROWS
FETCH FIRST 25 ROWS ONLY;

Next N

// Blaze Persistence – Keyset pagination
PagedList<Post> nextPage = cbf
    .create(entityManager, Post.class)
    .orderByAsc(Post_.CREATED_ON).orderByAsc(Post_.ID)
    .page(firstPage.getKeysetPage(),
          firstPage.getPage() * firstPage.getMaxResults(),
          firstPage.getMaxResults())
    .getResultList();
SELECT p.id, p.created_on, p.title,
       (SELECT count(*) FROM post)
FROM post p
WHERE ('2024-09-09 12:10:00.0', 10) < (p.created_on, p.id)
ORDER BY p.created_on, p.id
OFFSET 0 ROWS
FETCH FIRST 25 ROWS ONLY;

Projections

Fetching too many columns

Instead of fetching all columns:

SELECT *  
FROM post_comment pc  
LEFT JOIN post p ON p.id = pc.post_id  
LEFT JOIN post_details pd ON p.id = pd.id  


Fetch a custom SQL projection:


SELECT pc.id, pc.review
FROM post_comment pc
LEFT JOIN post p ON p.id = pc.post_id
LEFT JOIN post_details pd ON p.id = pd.id

(The joins aren't needed for this projection, so the second query is faster.)

Tuple projection

not type safe

  • The JPA Tuple wraps the default Object[] projection and allows you to retrieve the column values via their aliases.

List<Tuple> commentTuples = postRepository.findAllCommentTuplesByPostTitle(titleToken);
Tuple commentTuple = commentTuples.get(0);

long id = commentTuple.get("id", Number.class).longValue();
String title = commentTuple.get("title", String.class);
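
A sketch of the repository method used above, assuming Spring Data JPA's support for jakarta.persistence.Tuple return types when the query defines aliases:

@Query("""
    select
        p.id as id,
        p.title as title,
        c.review as review
    from PostComment c
    join c.post p
    where p.title like :postTitle
    order by c.id
""")
List<Tuple> findAllCommentTuplesByPostTitle(@Param("postTitle") String postTitle);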

Interface-based projection

  • If you want a type-safe projection, Spring Data JPA provides the option of wrapping the result in a Proxy based on a given interface.

The Interface-based projection can be used like this:

@Query("""
select
p.id as id,
p.title as title,
c.review as review
from PostComment c
join c.post p
where p.title like :postTitle
order by c.id
""")
List<PostCommentSummary> findAllCommentSummariesByPostTitle(
@Param("postTitle") String postTitle
);

// now type safe!
Long id = commentSummary.getId();
String title = commentSummary.getTitle();
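
The projection interface behind this only declares getters whose names match the aliases in the query (a minimal sketch):

public interface PostCommentSummary {

    Long getId();       // matches "p.id as id"
    String getTitle();  // matches "p.title as title"
    String getReview(); // matches "c.review as review"
}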

With built-in (!) DTOs and Records

DTOs

properties.put(
    "hibernate.integrator_provider",  // Hibernate property for custom integrator
    (IntegratorProvider) () -> Collections.singletonList( // Lambda to provide integrator
        new ClassImportIntegrator( // Hypersistence Utils integrator
            List.of( // List of classes to register
                PostCommentDTO.class,  // First DTO class
                PostCommentRecord.class // Second DTO class
            )
        )
    )
);

String jpql = "SELECT new PostCommentDTO(pc.id, pc.comment) FROM PostComment pc WHERE pc.post.id = :postId";


Records

public record PostCommentDTO(Long id, String comment) {}
public record PostCommentRecord(Long id, String content) {}

properties.put(
    "hibernate.integrator_provider",
    (IntegratorProvider) () -> Collections.singletonList(
        new ClassImportIntegrator(
            List.of(
                PostCommentDTO.class,   // Register the record as a DTO
                PostCommentRecord.class // Register additional records as needed
            )
        )
    )
);

@Query(
    value = "SELECT pc.id, pc.comment FROM post_comment pc WHERE pc.post_id = :postId",
    nativeQuery = true
)
List<PostCommentDTO> findCommentsByPostId(@Param("postId") Long postId);

Long id = commentRecord.id();
String content = commentRecord.content();

TupleTransformer and ResultListTransformer

The SQL projection can contain data from multiple tables, like the post and post_comment, which form a one-to-many table relationship.

SELECT p.id AS p_id,
       p.title AS p_title,
       pc.id AS pc_id,
       pc.review AS pc_review
FROM post p
JOIN post_comment pc ON p.id = pc.post_id
ORDER BY pc.id

which yields:

p_id | p_title                           | pc_id | pc_review
-----|-----------------------------------|-------|---------------------------------------
1    | High-Performance Java Persistence | 1     | Best book on JPA and Hibernate!
1    | High-Performance Java Persistence | 2     | A must-read for every Java developer!
2    | Hypersistence Optimizer           | 3     | It's like pair programming with Vlad!
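
A sketch of grouping this flat projection into one parent per post, assuming Hibernate 6's TupleTransformer/ResultListTransformer API; PostDTO and CommentDTO are hypothetical holder classes for this example (imports: java.util.*, jakarta.persistence.EntityManager):

record CommentDTO(Long id, String review) {}

class PostDTO {
    private final Long id;
    private final String title;
    private final List<CommentDTO> comments = new ArrayList<>();

    PostDTO(Long id, String title) {
        this.id = id;
        this.title = title;
    }

    Long getId() { return id; }
    String getTitle() { return title; }
    List<CommentDTO> getComments() { return comments; }
}

@SuppressWarnings("unchecked")
public List<PostDTO> fetchPostsWithComments(EntityManager entityManager) {
    return (List<PostDTO>) entityManager.createNativeQuery("""
            SELECT p.id AS p_id,
                   p.title AS p_title,
                   pc.id AS pc_id,
                   pc.review AS pc_review
            FROM post p
            JOIN post_comment pc ON p.id = pc.post_id
            ORDER BY pc.id
            """)
        .unwrap(org.hibernate.query.NativeQuery.class)
        // one flat row -> a PostDTO holding a single comment
        .setTupleTransformer((tuple, aliases) -> {
            PostDTO post = new PostDTO(((Number) tuple[0]).longValue(), (String) tuple[1]);
            post.getComments().add(
                new CommentDTO(((Number) tuple[2]).longValue(), (String) tuple[3]));
            return post;
        })
        // merge rows that belong to the same post
        .setResultListTransformer(list -> {
            Map<Long, PostDTO> postsById = new LinkedHashMap<>();
            for (Object row : list) {
                PostDTO post = (PostDTO) row;
                postsById.merge(post.getId(), post, (existing, incoming) -> {
                    existing.getComments().addAll(incoming.getComments());
                    return existing;
                });
            }
            return new ArrayList<>(postsById.values());
        })
        .getResultList();
}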

Spring snips

Docker

local

docker run -v ./app.jar:/app/app.jar \
-e DB_USERNAME='admin' -e DB_PASSWORD='password123' \
-p 8080:8080 \
amazoncorretto:21 \
java -jar /app/app.jar --spring.profiles.active=prod

start with a profile-specific .properties file:

DB_USERNAME=user java -jar target/app.jar --spring.profiles.active=prod

Application properties

in application-prod.properties

spring.application.version=@project.version@

# H2
spring.datasource.username=${DB_USERNAME}
spring.datasource.password=${DB_PASSWORD}
spring.datasource.url=jdbc:h2:mem:contactdb
spring.h2.console.enabled=true

# MySQL
spring.datasource.url=jdbc:mysql://localhost:3306/tempdb?useUnicode=true&useLegacyDatetimeCode=false&serverTimezone=UTC&createDatabaseIfNotExist=true&allowPublicKeyRetrieval=true&useSSL=false
spring.datasource.username=root
spring.datasource.password=<YOURPASS>
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver

# Hibernate JPA settings
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQLDialect
spring.jpa.hibernate.ddl-auto=update
spring.jpa.show-sql=true

# Logging level for debugging
logging.level.org.hibernate=DEBUG

logging.level.root=debug

actuator

for dev only:

management.endpoints.web.exposure.include=*
management.endpoint.health.show-details=always
management.server.port=9090

Intellij Endpoint POST Request

POST http://localhost:8080/players
Accept: */*
Accept-Encoding: gzip, deflate
Content-Type: application/json
Accept-Language: en-us

{
 "role" : "value3",
 "name": "value4"
}

Lombok

Log4j2 Pattern Layout Alternative

Configure Log4j2 to automatically include the class and method names in log messages by setting up a custom pattern in log4j2.xml:

<PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss} %-5p [%t] %C{1}.%M - %msg%n"/>

Swagger

<dependency>
    <groupId>org.springdoc</groupId>
    <artifactId>springdoc-openapi-starter-webmvc-ui</artifactId>
    <version>2.6.0</version>
</dependency>

http://localhost:8080/swagger-ui/index.html

and in JSON format:
http://localhost:8080/v3/api-docs

Profiles

in YAML -- this effectively splits the file into 2 YAML documents (separated by ---)

in .properties:

spring.profiles.active=sql
spring.config.activate.on-profile=sql

Kafka

https://kafka.apache.org/quickstart

# download kafka_2.13-3.8.0.tgz (see the quickstart), then unpack
tar -xzf kafka_2.13-3.8.0.tgz
cd kafka_2.13-3.8.0
# start the broker
bin/kafka-server-start.sh config/server.properties
# create and inspect a topic
bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092
bin/kafka-topics.sh --describe --topic quickstart-events --bootstrap-server localhost:9092
# produce
bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092
# consume
bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092
# the cab-location topic used by the demo below
bin/kafka-topics.sh --describe --topic cab-location --bootstrap-server localhost:9092
bin/kafka-console-consumer.sh --topic cab-location --from-beginning --bootstrap-server localhost:9092

application properties Driver

spring.application.name=kafkaBookingDriver
spring.kafka.producer.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
server.port=8082

application properties User

spring.application.name=kafkaBookingUser

spring.kafka.consumer.bootstrap-servers=localhost:9092
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.auto-offset-reset=earliest
server.port=8081

spring.kafka.consumer.group-id=user-group

Driver

needs:

  • config
  • controller
  • service

config:

package nl.appall.java.spring.kafkabookingdriver.config;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

import static nl.appall.java.spring.kafkabookingdriver.constant.AppConstant.CAB_LOCATION;

@Configuration
public class KafkaConfig {


    @Bean
    public NewTopic topic() {
        return TopicBuilder
                .name(CAB_LOCATION)
                .build();
    }
}
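
The AppConstant class referenced above only holds the topic name, consistent with the cab-location topic used by the consumer below (a sketch):

package nl.appall.java.spring.kafkabookingdriver.constant;

public final class AppConstant {

    public static final String CAB_LOCATION = "cab-location";

    private AppConstant() {
    }
}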

controller:

package nl.appall.java.spring.kafkabookingdriver.controller;

import nl.appall.java.spring.kafkabookingdriver.service.CabLocationService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.Map;

@RestController
@RequestMapping("/location")
public class CabLocationController {

    @Autowired private CabLocationService cabLocationService;

    @PutMapping
    public ResponseEntity<Map<String, String>> updateLocation() throws InterruptedException {

        int range = 100;
        while(range > 0) {
            cabLocationService.updateLocation(Math.random() + " , "+ Math.random());
            Thread.sleep(1000);
            range--;
        }
        return new ResponseEntity<>(Map.of("message","Location Updated"), HttpStatus.OK);
    }
}

service:

package nl.appall.java.spring.kafkabookingdriver.service;

import nl.appall.java.spring.kafkabookingdriver.constant.AppConstant;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class CabLocationService {

    @Autowired private KafkaTemplate<String,Object> kafkaTemplate;

    public boolean updateLocation(String location){
        kafkaTemplate.send(AppConstant.CAB_LOCATION, location);
        return true;
    }
}

User

needs only service

package nl.appall.java.spring.kafkabookinguser;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class LocationService {

    @KafkaListener(
            topics = "cab-location",
            groupId = "user-group"
    )
    public void cabLocation(String location){
        System.out.println(location);

    }
}


Spring + Angular in one jar

based on:
Webapp with Create React App

  • create spring app
  • prevent cors
@Bean
public WebMvcConfigurer corsConfigurer() {
    return new WebMvcConfigurer() {
        @Override
        public void addCorsMappings(CorsRegistry registry) {
            registry.addMapping("/**")
                    .allowedOrigins("http://localhost:4200")
                    .allowedHeaders("application/json", "text/plain", "*/*",
                            "Access-Control-Allow-Headers",
                            "Access-Control-Allow-Origin",
                            "Origin", "Accept", "X-Requested-With, Content-Type",
                            "Access-Control-Request-Method",
                            "Access-Control-Request-Headers")
                    .allowedMethods("GET", "PUT", "POST", "PATCH", "DELETE", "OPTIONS")
                    .maxAge(3600);
        }
    };
}

  • in root create new ng app
  • create proxy.conf.json in root frontend app
{
  "/api/*": {
    "target": "http://localhost:8080",
    "secure": false,
    "logLevel": "debug",
    "changeOrigin": true
  }
}
  • change angular.json:
            "development": {
              "browserTarget": "frontend:build:development",
              "proxyConfig": "proxy.conf.json"
            }
  • add plugin for building the frontend:
    note: version!!!
    note: npmVersion!!!
    note: nodeVersion!!!
<plugin>
                <groupId>com.github.eirslett</groupId>
                <artifactId>frontend-maven-plugin</artifactId>
                <version>1.9.1</version>
                <configuration>
                    <workingDirectory>frontend</workingDirectory>
                    <installDirectory>target</installDirectory>
                </configuration>
                <executions>
                    <execution>
                        <id>install node and npm</id>
                        <goals>
                            <goal>install-node-and-npm</goal>
                        </goals>
                        <configuration>
                            <nodeVersion>v14.15.4</nodeVersion>
                            <npmVersion>7.17.0</npmVersion>
                        </configuration>
                    </execution>
                    <execution>
                        <id>npm install</id>
                        <goals>
                            <goal>npm</goal>
                        </goals>
                        <configuration>
                            <arguments>install</arguments>
                        </configuration>
                    </execution>
                    <execution>
                        <id>npm run build</id>
                        <goals>
                            <goal>npm</goal>
                        </goals>
                        <configuration>
                            <arguments>run build</arguments>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

  • copy the dist version to target:
    note double check location where the build is located
    note mvn clean package
<plugin>
                <artifactId>maven-antrun-plugin</artifactId>
                <executions>
                    <execution>
                        <phase>generate-resources</phase>
                        <configuration>
                            <target>
                                <copy todir="${project.build.directory}/classes/public">
                                    <fileset dir="${project.basedir}/frontend/dist"/>
                                </copy>
                            </target>
                        </configuration>
                        <goals>
                            <goal>run</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

Spring MySQL One-to-Many

pom.xml

  • lombok
  • spring-boot-starter-data-jpa
  • mysql-connector-java

application.properties

# MySQL connection properties
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.username=username
spring.datasource.password=password
spring.datasource.url=jdbc:mysql://localhost:3306/testspring

# Log JPA queries
# Comment this in production
spring.jpa.show-sql=true

# Drop and create new tables (create, create-drop, validate, update)
# Only for testing purpose - comment this in production
spring.jpa.hibernate.ddl-auto=create-drop

# Hibernate SQL dialect
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL5InnoDBDialect

RestApplication.java

@SpringBootApplication
public class RestApplication {

    public static void main(String[] args) {
        SpringApplication.run(RestApplication.class, args);
    }

    @Bean
    public CommandLineRunner mappingDemo(BookRepository bookRepository,
                                         PageRepository pageRepository) {
        return args -> {

            // create a new book
            Book book = new Book("Java 101", "John Doe", "123456");

            // save the book
            bookRepository.save(book);

            // create and save the pages (matches the JSON output shown below)
            pageRepository.save(new Page(1, "Introduction contents", "Introduction", book));
            pageRepository.save(new Page(65, "Java 8 contents", "Java 8", book));
            pageRepository.save(new Page(95, "Concurrency contents", "Concurrency", book));
        };
    }
}

Book.java


@Entity @Table(name = "books") @Getter @Setter public class Book implements Serializable { @Id @GeneratedValue(strategy = GenerationType.IDENTITY) private Long id; private String title; private String author; @Column(unique = true) private String isbn; ```**```@JsonManagedReference```**``` @OneToMany(mappedBy = "book", fetch = FetchType.LAZY, cascade = CascadeType.ALL) private List<Page> pages = new ArrayList<Page>(); public Book() { } public Book(String title, String author, String isbn) { this.title = title; this.author = author; this.isbn = isbn; } // getters and setters, equals(), toString() .... (omitted for brevity) @Override public String toString() { return "Book{" + "id=" + id + ", title='" + title + '\'' + ", author='" + author + '\'' + ", isbn='" + isbn + '\'' + ", number of pages=" + pages.size() + '}'; } }

Page.java


@Entity @Table(name = "pages") @Setter @Getter public class Page implements Serializable { @Id @GeneratedValue(strategy = GenerationType.IDENTITY) private Long id; private int number; private String content; private String chapter; @JsonBackReference @ManyToOne(fetch = FetchType.LAZY, optional = false) @JoinColumn(name = "book_id", nullable = false) private Book book; public Page() { } public Page(int number, String content, String chapter, Book book) { this.number = number; this.content = content; this.chapter = chapter; this.book = book; } // getters and setters, equals(), toString() .... (omitted for brevity) @Override public String toString() { return "Page{" + "id=" + id + ", number=" + number + ", content='" + content + '\'' + ", chapter='" + chapter + '\'' + ", book=" + book.toString() + '}'; } }

BookRepository.java

public interface BookRepository extends CrudRepository<Book, Long> {

    Book findByIsbn(String isbn);
}

PageRepository.java

public interface PageRepository extends CrudRepository<Page, Long> {

    List<Page> findByBook(Book book, Sort sort);
}
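
Example usage of the derived query with a Sort (a sketch):

List<Page> pages = pageRepository.findByBook(book, Sort.by(Sort.Direction.ASC, "number"));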

BooksController.java


@RestController
@RequestMapping(value = "/books", produces = MediaType.APPLICATION_JSON_VALUE)
public class BooksController {

    private final BookRepository bookRepository;
    private final PageRepository pageRepository;

    public BooksController(BookRepository bookRepository, PageRepository pageRepository) {
        this.bookRepository = bookRepository;
        this.pageRepository = pageRepository;
    }

    @GetMapping(value = "/pages", produces = MediaType.APPLICATION_JSON_VALUE)
    @ResponseStatus(HttpStatus.OK)
    public List<Page> pages() {
        List<Page> result = (List<Page>) pageRepository.findAll();
        System.out.println(result);
        System.out.println(result.get(0));
        return result;
    }

    @GetMapping("/")
    public List<Book> books() {
        return (List<Book>) bookRepository.findAll();
    }
}

http://localhost:8080/books/

[
  {
    "id": 1,
    "title": "Java 101",
    "author": "John Doe",
    "isbn": "123456",
    "pages": [
      {
        "id": 1,
        "number": 1,
        "content": "Introduction contents",
        "chapter": "Introduction"
      },
      {
        "id": 2,
        "number": 65,
        "content": "Java 8 contents",
        "chapter": "Java 8"
      },
      {
        "id": 3,
        "number": 95,
        "content": "Concurrency contents",
        "chapter": "Concurrency"
      }
    ]
  }
]

http://localhost:8080/books/pages

[
  {
    "id": 1,
    "number": 1,
    "content": "Introduction contents",
    "chapter": "Introduction"
  },
  {
    "id": 2,
    "number": 65,
    "content": "Java 8 contents",
    "chapter": "Java 8"
  },
  {
    "id": 3,
    "number": 95,
    "content": "Concurrency contents",
    "chapter": "Concurrency"
  }
]