[WebSocket][Django][Vite][Docker] WebSocket routes fail in development mode but work in production

Hello everyone,

I am encountering an issue where, when the application runs in development mode, every WebSocket route fails to connect. The problem does not occur in production mode behind Nginx.

Below is a screenshot of the error:

The backend is Django REST Framework (DRF) served by Daphne; the frontend is React served by Vite. Both run in Docker containers.

To start the application, I use the following command:

docker compose -f compose.yml -f [dev/prod].yml up --build

Django is started with the command: daphne -b 0.0.0.0 -p 8000 backend.asgi:application

Here is my code:

compose.yml:

services:
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    container_name: backend
    image: cc_guardian_backend
    volumes:
      - ./backend/database:/app/backend/database
    networks:
      - cc_guardian_net
    environment:
      - ENV_MODE=default

  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    container_name: frontend
    image: cc_guardian_frontend
    networks:
      - cc_guardian_net
    environment:
      - ENV_MODE=default

networks:
  cc_guardian_net:
    driver: bridge

volumes:
  database:

dev.yml:

services:
  backend:
    environment:
      - ENV_MODE=dev
    ports:
      - 8000:8000
    command: sh /app/entrypoint.sh dev
    volumes:
      - ./backend:/app # Hot-reloading

  frontend:
    volumes:
      - ./frontend:/app # Hot-reloading
      - /app/node_modules # Keeps node modules
    ports:
      - 3000:3000
    environment:
      - ENV_MODE=dev
    command: sh -c "./update_env.sh && npm run dev"

volumes:
  backend:
  frontend:

prod.yml:

services:
  backend:
    environment:
      - ENV_MODE=prod
    ports:
      - 8000:8000
    command: sh /app/entrypoint.sh prod
    volumes:
      - ./backend/static:/app/static
    depends_on:
      - redis
    restart: unless-stopped

  frontend:
    volumes:
      - ./frontend/dist:/app/dist # share static files
    environment:
      - ENV_MODE=prod
    command: sh -c "./update_env.sh && npm run build"

  redis:
    image: "redis:alpine"
    container_name: redis
    networks:
      - cc_guardian_net
    volumes:
      - ./redis.conf:/usr/local/etc/redis/redis.conf
    command: redis-server /usr/local/etc/redis/redis.conf
    ulimits:
      nofile:
        soft: 65535
        hard: 65535
    restart: unless-stopped

  nginx:
    build:
      context: ./nginx
    container_name: nginx
    volumes:
      - ./frontend/dist:/app/dist
      - ./backend/static:/app/static
    ports:
      - 80:80
      - 443:443
    depends_on:
      - frontend
      - backend
    networks:
      - cc_guardian_net
    restart: unless-stopped

entrypoint.sh:

#!/bin/bash

# Run migrations
python3 manage.py makemigrations --noinput
python3 manage.py migrate --noinput

if [ "$1" = "prod" ]; then
    python3 manage.py collectstatic --noinput
fi

# exec lets Daphne replace the shell as PID 1 so it receives SIGTERM and the
# container can stop cleanly (the syslog below shows a forced kill after 10s).
exec daphne -b 0.0.0.0 -p 8000 backend.asgi:application

Django settings:

INSTALLED_APPS = [
    "daphne",
    "channels",
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "django.contrib.sessions",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    "django_extensions",
    "rest_framework",
    "corsheaders",
    "api",
    "user_api",
]

REST_FRAMEWORK = {
    "DEFAULT_AUTHENTICATION_CLASSES": [
        "rest_framework.authentication.SessionAuthentication",
    ],
    "DEFAULT_PERMISSION_CLASSES": [
        "rest_framework.permissions.IsAuthenticated",
    ],
}

SESSION_ENGINE = (
    "django.contrib.sessions.backends.db"  # Store the sessions in the database
)

MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    "corsheaders.middleware.CorsMiddleware",
    "django.contrib.sessions.middleware.SessionMiddleware",
    "django.middleware.common.CommonMiddleware",
    "django.middleware.csrf.CsrfViewMiddleware",
    "django.contrib.auth.middleware.AuthenticationMiddleware",
    "django.contrib.messages.middleware.MessageMiddleware",
    "django.middleware.clickjacking.XFrameOptionsMiddleware",
]

ROOT_URLCONF = "backend.urls"

TEMPLATES = [
    {
        "BACKEND": "django.template.backends.django.DjangoTemplates",
        "DIRS": [],
        "APP_DIRS": True,
        "OPTIONS": {
            "context_processors": [
                "django.template.context_processors.debug",
                "django.template.context_processors.request",
                "django.contrib.auth.context_processors.auth",
                "django.contrib.messages.context_processors.messages",
            ],
        },
    },
]

# WSGI_APPLICATION = "backend.wsgi.application"
ASGI_APPLICATION = "backend.asgi.application"


# Database
# https://docs.djangoproject.com/en/5.1/ref/settings/#databases

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "database" / "db.sqlite3",
    }
}


# Password validation
# https://docs.djangoproject.com/en/5.1/ref/settings/#auth-password-validators

AUTH_PASSWORD_VALIDATORS = [
    {
        "NAME": "django.contrib.auth.password_validation.UserAttributeSimilarityValidator",
    },
    {
        "NAME": "django.contrib.auth.password_validation.MinimumLengthValidator",
    },
    {
        "NAME": "django.contrib.auth.password_validation.CommonPasswordValidator",
    },
    {
        "NAME": "django.contrib.auth.password_validation.NumericPasswordValidator",
    },
]

AUTH_USER_MODEL = "user_api.AppUser"  # Custom User model

# Internationalization
# https://docs.djangoproject.com/en/5.1/topics/i18n/

LANGUAGE_CODE = "en-us"

TIME_ZONE = "UTC"

USE_I18N = True

USE_TZ = True


# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/5.1/howto/static-files/

STATIC_URL = "/static/"
STATIC_ROOT = os.path.join(BASE_DIR, "static")

# Default primary key field type
# https://docs.djangoproject.com/en/5.1/ref/settings/#default-auto-field

DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField"

Dev settings.py:

CORS_ALLOW_CREDENTIALS = True
DEBUG = True
ALLOWED_HOSTS = ["*"]

CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels.layers.InMemoryChannelLayer",
    },
}
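
The in-memory layer only works within a single process. Since the production stack already runs Redis, a Redis-backed layer is the usual multi-process option; a minimal sketch, assuming the channels_redis package is installed and the compose redis service is reachable under the hostname redis:

# Sketch only: Redis-backed channel layer (assumes channels_redis is
# installed and a "redis" host is resolvable on the compose network).
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {"hosts": [("redis", 6379)]},
    },
}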

CSRF_TRUSTED_ORIGINS = [
    "http://127.0.0.1:3000",
    "http://127.0.0.1:8000",
    "http://localhost:3000",
    "http://localhost:8000",
]

# Note: CORS_ALLOW_ALL_ORIGINS = True below makes this allow-list redundant
# (django-cors-headers ignores it). Also, CORS applies only to HTTP requests;
# WebSocket handshakes are checked by AllowedHostsOriginValidator in asgi.py.
CORS_ALLOWED_ORIGINS = [
    "http://localhost:3000",
    "http://127.0.0.1:3000",
    "http://localhost:8000",
    "http://127.0.0.1:8000",
]

CORS_ALLOW_ALL_ORIGINS = True

CORS_ALLOW_METHODS = [
    'DELETE',
    'GET',
    'OPTIONS',
    'PATCH',
    'POST',
    'PUT',
]

CORS_ALLOW_HEADERS = [
    'accept',
    'accept-encoding',
    'authorization',
    'content-type',
    'dnt',
    'origin',
    'user-agent',
    'x-csrftoken',
    'x-requested-with',
]

consumers.py:

import json

from asgiref.sync import async_to_sync
from channels.generic.websocket import WebsocketConsumer


class ServerUpdateStatusConsumer(WebsocketConsumer):
    def connect(self):
        print(f"WebSocket connect: {self.scope['client']}")
        self.server_hostname = self.scope["url_route"]["kwargs"].get(
            "server_hostname", None
        )

        if self.server_hostname:
            # Specific group to each server (send to environment page)
            self.room_group_name = f"server_update_{self.server_hostname}"
        else:
            # General group for all the servers (send to dashboard page)
            self.room_group_name = "server_update_general"

        # Add the consumer to the group
        async_to_sync(self.channel_layer.group_add)(
            self.room_group_name, self.channel_name
        )

        self.accept()

    def disconnect(self, close_code):
        # Remove the consumer when disconnect
        async_to_sync(self.channel_layer.group_discard)(
            self.room_group_name, self.channel_name
        )

    def receive(self, text_data):
        # Can be used for frontend response
        pass

    # This method handles the server_status_update message
    def server_status_update(self, event):
        # Send the data to the WebSocket
        self.send(
            text_data=json.dumps(
                {
                    "hostname": event["hostname"],
                    "key": event["key"],
                    "status": event["status"],
                    "child_hostname": event.get(
                        "child_hostname", None
                    ),  # Include child_hostname if available
                }
            )
        )
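
For context, nothing in this post shows the producer side; updates reach server_status_update through the channel layer, roughly like this (a sketch only; apart from the group name and the handler type, the names here are hypothetical):

# Hypothetical producer-side sketch: push a status update into the general
# group so every connected ServerUpdateStatusConsumer runs its
# server_status_update handler (the "type" key maps to the method name).
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer

def broadcast_server_status(hostname, key, status, child_hostname=None):
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.group_send)(
        "server_update_general",
        {
            "type": "server_status_update",
            "hostname": hostname,
            "key": key,
            "status": status,
            "child_hostname": child_hostname,
        },
    )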

asgi.py:

import os

from django.core.asgi import get_asgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "backend.settings")

# Initialize the Django ASGI app first so the app registry is populated
# before importing routing/consumer code that may touch models.
django_asgi_app = get_asgi_application()

from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter
from channels.security.websocket import AllowedHostsOriginValidator

from .routing import websocket_urlpatterns

application = ProtocolTypeRouter(
    {
        "http": django_asgi_app,
        # AllowedHostsOriginValidator checks the handshake's Origin header
        # against ALLOWED_HOSTS (["*"] in dev, so all origins pass).
        "websocket": AllowedHostsOriginValidator(
            AuthMiddlewareStack(URLRouter(websocket_urlpatterns))
        ),
    }
)

routing.py:

from django.urls import re_path
from .consumers import LogsConsumer, ServerUpdateStatusConsumer

websocket_urlpatterns = [
    re_path(r"ws/logs/(?P<server_hostname>[\w-]+)/$", LogsConsumer.as_asgi()),
    re_path(r"ws/server-update-status/$", ServerUpdateStatusConsumer.as_asgi()),
    re_path(
        r"ws/server-update-status/(?P<server_hostname>[\w-]+)/$",
        ServerUpdateStatusConsumer.as_asgi(),
    ),
]
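
These routes can be exercised without the frontend in the middle, which helps tell a proxy/origin problem apart from a Channels problem. A quick sketch, assuming the third-party websockets package is installed and Daphne is listening on 127.0.0.1:8000 (recv() blocks until a group message arrives):

# Sketch: connect straight to Daphne, bypassing Vite entirely.
import asyncio
import websockets

async def main():
    uri = "ws://127.0.0.1:8000/ws/server-update-status/"
    async with websockets.connect(uri) as ws:
        print("Connected:", uri)
        print("First message:", await ws.recv())

asyncio.run(main())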

JavaScript code:

import React, { useEffect, useState } from 'react';
import api from '../api';
import Card from '../components/Card';
import { Box, Grid, Typography, CircularProgress } from '@mui/material';

const TruncatedTitle = (title) => {
    return title.length > 49 ? `${title.slice(0, 46)}...` : title;
};

const Dashboard = () => {
    const [servers, setServers] = useState([]);
    const [isLoading, setIsLoading] = useState(true);
    // Note: if VITE_API_BASE_URL_WS is unset, the WebSocket URL below
    // becomes "undefined/ws/...", which fails immediately.
    const wsUrl = import.meta.env.VITE_API_BASE_URL_WS;

    // Load server data
    useEffect(() => {
        const fetchServers = async () => {
            try {
                const response = await api.get(`/api/user-linked-servers/`);
                setServers(response.data);
            } catch (error) {
                console.error('Error searching servers:', error);
            } finally {
                setIsLoading(false);
            }
        };

        fetchServers();

        const ws = new WebSocket(`${wsUrl}/ws/server-update-status/`);
        ws.onopen = (event) => {
            console.log('WebSocket connection established');
        };

        ws.onerror = (error) => {
            console.error('WebSocket error:', error);
        };

        ws.onclose = (event) => {
            console.log('WebSocket connection closed:', event);
        };

        ws.onmessage = (event) => {
            try {
                const updatedData = JSON.parse(event.data);
                console.log('Received WebSocket message:', updatedData);
                const { hostname, key, status, child_hostname } = updatedData;

                setServers((prevServers) =>
                    prevServers.map((server) => {
                        if (server.hostname === hostname) {
                            switch (key) {
                                case 'application_status':
                                case 'queues_status':
                                case 'event_viewer_status':
                                case 'services_status':
                                    return {
                                        ...server,
                                        [key]: { status: status }
                                    };
                                case 'kafka':
                                case 'databases':
                                case 'sbs':
                                case 'nms':
                                case 'collectors':
                                    return {
                                        ...server,
                                        [key]: server[key].map((item) => {
                                            return item.hostname.toLowerCase() ===
                                                child_hostname.toLowerCase()
                                                ? { ...item, status: status }
                                                : item;
                                        })
                                    };
                                default:
                                    return server;
                            }
                        }
                        return server;
                    })
                );
            } catch (error) {
                console.error('Error processing WebSocket message:', error);
            }
        };

        return () => {
            ws.close();
        };
    }, []);

    // ... (render logic omitted from the original post)
};

export default Dashboard;

api.js:

import axios from 'axios';

function getCookie(name) {
    let cookieValue = null;
    if (document.cookie && document.cookie !== '') {
        const cookies = document.cookie.split(';');
        for (let i = 0; i < cookies.length; i++) {
            const cookie = cookies[i].trim();
            if (cookie.substring(0, name.length + 1) === name + '=') {
                cookieValue = decodeURIComponent(
                    cookie.substring(name.length + 1)
                );
                break;
            }
        }
    }
    return cookieValue;
}

const csrfToken = getCookie('csrftoken');
axios.defaults.xsrfCookieName = 'csrftoken';
axios.defaults.xsrfHeaderName = 'X-CSRFToken';
axios.defaults.withCredentials = true;

const api = axios.create({
    baseURL: import.meta.env.VITE_API_BASE_URL || 'http://127.0.0.1:8000',
    headers: {
        'X-CSRFToken': csrfToken,
        'Content-Type': 'application/json',
        Accept: 'application/json'
    }
});

export default api;

vite.config.js:

import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  server: {
    host: '0.0.0.0',
    port: 3000,
  },
  plugins: [react()],
});
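
In development the page is served by Vite on port 3000 while Daphne listens on 8000. For reference, Vite's dev server can also proxy WebSocket upgrades itself, so the client could target the same origin it was served from; a hypothetical variant (assuming the backend container is resolvable as "backend" on the compose network, with VITE_API_BASE_URL_WS then pointing at ws://localhost:3000):

import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  server: {
    host: '0.0.0.0',
    port: 3000,
    proxy: {
      // Forward /ws to Daphne; ws: true also proxies the upgrade request.
      '/ws': {
        target: 'ws://backend:8000',
        ws: true,
      },
    },
  },
  plugins: [react()],
});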


Welcome @lucaslopes!

What do your docker logs, daphne logs, and syslog show for these connection attempts?

What does your /app/entrypoint.sh script look like?

What happens if you run this environment natively, and not within docker containers?

Thanks for your response, Ken!

  • I updated the post with the entrypoint.sh code.
  • The same error occurs when running this environment natively.
  • It’s worth mentioning that I’m running the project on WSL.
  • The requested logs are below.

Docker Logs:

❯ docker compose -f compose.yml -f dev.yml up --build
[+] Building 162.3s (27/27) FINISHED                                            docker:default
 => [frontend internal] load build definition from Dockerfile                             0.0s
 => => transferring dockerfile: 388B                                                      0.0s
 => [backend internal] load build definition from Dockerfile                              0.0s
 => => transferring dockerfile: 391B                                                      0.0s
 => [frontend internal] load metadata for docker.io/library/node:22-alpine                1.2s
 => [backend internal] load metadata for docker.io/library/python:3.13.0-slim             1.3s
 => [frontend internal] load .dockerignore                                                0.0s
 => => transferring context: 2B                                                           0.0s
 => [frontend internal] load build context                                                2.5s
 => => transferring context: 84.37MB                                                      2.4s
 => [frontend 1/7] FROM docker.io/library/node:22-alpine@sha256:b64ced2e7cd0a4816699fe30  0.0s
 => [backend internal] load .dockerignore                                                 0.0s
 => => transferring context: 2B                                                           0.0s
 => [backend 1/8] FROM docker.io/library/python:3.13.0-slim@sha256:eda73cafdf051c7618537  0.0s
 => => resolve docker.io/library/python:3.13.0-slim@sha256:eda73cafdf051c76185375d7c20c4  0.0s
 => [backend internal] load build context                                                 3.0s
 => => transferring context: 122.35MB                                                     2.9s
 => CACHED [frontend 2/7] WORKDIR /app                                                    0.0s
 => [frontend 3/7] COPY package*.json .                                                   0.5s
 => [frontend 4/7] RUN npm install                                                      152.1s
 => CACHED [backend 2/8] WORKDIR /app                                                     0.0s
 => CACHED [backend 3/8] COPY requirements.txt .                                          0.0s
 => CACHED [backend 4/8] RUN pip install --upgrade pip                                    0.0s
 => CACHED [backend 5/8] RUN pip install -r requirements.txt                              0.0s
 => [backend 6/8] COPY . .                                                                1.7s
 => [backend 7/8] COPY entrypoint.sh /app/entrypoint.sh                                   0.1s
 => [backend 8/8] RUN chmod +x /app/entrypoint.sh                                         0.5s
 => [backend] exporting to image                                                          1.3s
 => => exporting layers                                                                   1.2s
 => => writing image sha256:d90a68b631555b980321e29023ca2f605e7e5dabbb512d1e32f644ea25fd  0.0s
 => => naming to docker.io/library/cc_guardian_backend                                    0.0s
 => [backend] resolving provenance for metadata file                                      0.0s
 => [frontend 5/7] COPY . .                                                               4.3s
 => [frontend 6/7] COPY update_env.sh /app/update_env.sh                                  0.1s
 => [frontend 7/7] RUN chmod +x /app/update_env.sh                                        0.4s
 => [frontend] exporting to image                                                         1.1s
 => => exporting layers                                                                   1.0s
 => => writing image sha256:375403eeac5e577b9bea6d3b33d6f189f91afb10a31f78cec587e6823770  0.0s
 => => naming to docker.io/library/cc_guardian_frontend                                   0.0s
 => [frontend] resolving provenance for metadata file                                     0.0s
[+] Running 3/3
 ✔ Network cc-guardian_cc_guardian_net  Created                                           0.3s
 ✔ Container frontend                   Created                                           1.0s
 ✔ Container backend                    Created                                           0.1s
Attaching to backend, frontend
frontend  |
frontend  | > frontend@0.0.0 dev
frontend  | > vite
frontend  |
frontend  | Re-optimizing dependencies because vite config has changed
frontend  |
frontend  |   VITE v5.4.11  ready in 230 ms
frontend  |
frontend  |   ➜  Local:   http://localhost:3000/
frontend  |   ➜  Network: http://172.18.0.2:3000/
backend   | No changes detected
backend   | Operations to perform:
backend   |   Apply all migrations: admin, api, auth, contenttypes, sessions, user_api
backend   | Running migrations:
backend   |   No migrations to apply.
backend   | 2024-12-03 10:38:48,802 INFO     Starting server at tcp:port=8000:interface=0.0.0.0
backend   | 2024-12-03 10:38:48,803 INFO     HTTP/2 support not enabled (install the http2 and tls Twisted extras)
backend   | 2024-12-03 10:38:48,803 INFO     Configuring endpoint tcp:port=8000:interface=0.0.0.0
backend   | 2024-12-03 10:38:48,804 INFO     Listening on TCP address 0.0.0.0:8000

Daphne logs:

❯ daphne -b 0.0.0.0 -p 8000 backend.asgi:application

2024-12-03 10:44:58,062 INFO     Starting server at tcp:port=8000:interface=0.0.0.0
2024-12-03 10:44:58,062 INFO     HTTP/2 support not enabled (install the http2 and tls Twisted extras)
2024-12-03 10:44:58,063 INFO     Configuring endpoint tcp:port=8000:interface=0.0.0.0
2024-12-03 10:44:58,063 INFO     Listening on TCP address 0.0.0.0:8000
127.0.0.1:37310 - - [03/Dec/2024:10:45:03] "GET /api/users/user" 200 95
127.0.0.1:37292 - - [03/Dec/2024:10:45:03] "GET /api/users/user" 200 95
127.0.0.1:37314 - - [03/Dec/2024:10:45:03] "GET /api/users/user" 200 95
127.0.0.1:37304 - - [03/Dec/2024:10:45:03] "GET /api/users/user" 200 95
127.0.0.1:37326 - - [03/Dec/2024:10:45:03] "WSCONNECTING /ws/server-update-status/" - -
<ASGIRequest: GET '/api/user-linked-servers/'>
<ASGIRequest: GET '/api/user-linked-servers/'>
WebSocket connect: ['127.0.0.1', 37326]
<QuerySet ['brcwbvr72', 'brcwbvr96', 'brcwbvr98']>
<QuerySet ['brcwbvr72', 'brcwbvr96', 'brcwbvr98']>
127.0.0.1:37326 - - [03/Dec/2024:10:45:03] "WSCONNECT /ws/server-update-status/" - -
127.0.0.1:37292 - - [03/Dec/2024:10:45:03] "GET /api/user-linked-servers/" 200 2015
127.0.0.1:37310 - - [03/Dec/2024:10:45:03] "GET /api/user-linked-servers/" 200 2015

Syslog:

❯ sudo tail -n 50 /var/log/syslog

2024-12-03T07:44:25.503827-03:00 BRCWBNB0553 containerd[285]: time="2024-12-03T07:44:25.503507216-03:00" level=info msg="shim disconnected" id=422b4ae7ed2f919a32a1459ae259f5fc28a9a471cca7087884810503a0bdebdf namespace=moby
2024-12-03T07:44:25.503952-03:00 BRCWBNB0553 containerd[285]: time="2024-12-03T07:44:25.503589675-03:00" level=warning msg="cleaning up after shim disconnected" id=422b4ae7ed2f919a32a1459ae259f5fc28a9a471cca7087884810503a0bdebdf namespace=moby
2024-12-03T07:44:25.503995-03:00 BRCWBNB0553 containerd[285]: time="2024-12-03T07:44:25.503603884-03:00" level=info msg="cleaning up dead shim" namespace=moby
2024-12-03T07:44:25.504118-03:00 BRCWBNB0553 dockerd[393]: time="2024-12-03T07:44:25.503626476-03:00" level=info msg="ignoring event" container=422b4ae7ed2f919a32a1459ae259f5fc28a9a471cca7087884810503a0bdebdf module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
2024-12-03T07:44:25.758080-03:00 BRCWBNB0553 kernel: br-5a5d1ea7553a: port 2(vethd2a9530) entered disabled state
2024-12-03T07:44:25.758115-03:00 BRCWBNB0553 kernel: vethc3d79f3: renamed from eth0
2024-12-03T07:44:25.888086-03:00 BRCWBNB0553 kernel: br-5a5d1ea7553a: port 2(vethd2a9530) entered disabled state
2024-12-03T07:44:25.888122-03:00 BRCWBNB0553 kernel: device vethd2a9530 left promiscuous mode
2024-12-03T07:44:25.888124-03:00 BRCWBNB0553 kernel: br-5a5d1ea7553a: port 2(vethd2a9530) entered disabled state
2024-12-03T07:44:26.018110-03:00 BRCWBNB0553 systemd[1]: run-docker-netns-2122f4a10147.mount: Deactivated successfully.
2024-12-03T07:44:26.022754-03:00 BRCWBNB0553 systemd[1]: var-lib-docker-overlay2-9ed8f6c919f0ad90e013a73e307337681b8544b382842a99727e1e6da9522d8b-merged.mount: Deactivated successfully.
2024-12-03T07:44:35.433715-03:00 BRCWBNB0553 dockerd[393]: time="2024-12-03T07:44:35.433438145-03:00" level=info msg="Container failed to exit within 10s of signal 15 - using the force" container=bf50cf231c0429521a441627445650f1a9db4842182f520e361366ed5dab193f spanID=70e9e0f5a17e6906 traceID=112893ae5477deec8861047c6153cdd9
2024-12-03T07:44:35.460426-03:00 BRCWBNB0553 dockerd[393]: time="2024-12-03T07:44:35.460342061-03:00" level=info msg="ignoring event" container=bf50cf231c0429521a441627445650f1a9db4842182f520e361366ed5dab193f module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
2024-12-03T07:44:35.461098-03:00 BRCWBNB0553 containerd[285]: time="2024-12-03T07:44:35.460794792-03:00" level=info msg="shim disconnected" id=bf50cf231c0429521a441627445650f1a9db4842182f520e361366ed5dab193f namespace=moby
2024-12-03T07:44:35.461187-03:00 BRCWBNB0553 containerd[285]: time="2024-12-03T07:44:35.460871968-03:00" level=warning msg="cleaning up after shim disconnected" id=bf50cf231c0429521a441627445650f1a9db4842182f520e361366ed5dab193f namespace=moby
2024-12-03T07:44:35.461251-03:00 BRCWBNB0553 containerd[285]: time="2024-12-03T07:44:35.460885178-03:00" level=info msg="cleaning up dead shim" namespace=moby
2024-12-03T07:44:35.728129-03:00 BRCWBNB0553 kernel: br-5a5d1ea7553a: port 1(veth7f7281c) entered disabled state
2024-12-03T07:44:35.728172-03:00 BRCWBNB0553 kernel: veth8ef63d7: renamed from eth0
2024-12-03T07:44:35.868039-03:00 BRCWBNB0553 kernel: br-5a5d1ea7553a: port 1(veth7f7281c) entered disabled state
2024-12-03T07:44:35.878037-03:00 BRCWBNB0553 kernel: device veth7f7281c left promiscuous mode
2024-12-03T07:44:35.878078-03:00 BRCWBNB0553 kernel: br-5a5d1ea7553a: port 1(veth7f7281c) entered disabled state
2024-12-03T07:44:36.039950-03:00 BRCWBNB0553 systemd[1]: run-docker-netns-d9d6fdf8c9aa.mount: Deactivated successfully.
2024-12-03T07:44:36.044912-03:00 BRCWBNB0553 systemd[1]: var-lib-docker-overlay2-28008de5d0fd16cd3643a0245248e12b04a655a7a402fec789ccddc6345fdbe2-merged.mount: Deactivated successfully.
2024-12-03T07:45:16.843768-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Daemon: connecting to Windows Agent
2024-12-03T07:45:16.844237-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Connecting"
2024-12-03T07:45:16.845528-03:00 BRCWBNB0553 wsl-pro-service[249]: WARNING Daemon: could not connect to Windows Agent: could not get address: could not read agent port file "/mnt/c/Users/lopesl/.ubuntupro/.address": open /mnt/c/Users/lopesl/.ubuntupro/.address: no such file or directory
2024-12-03T07:45:16.845612-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Reconnecting to Windows host in 60 seconds
2024-12-03T07:45:16.845682-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Not connected: waiting to retry"
2024-12-03T07:46:16.901176-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Daemon: connecting to Windows Agent
2024-12-03T07:46:16.908143-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Connecting"
2024-12-03T07:46:16.908205-03:00 BRCWBNB0553 wsl-pro-service[249]: WARNING Daemon: could not connect to Windows Agent: could not get address: could not read agent port file "/mnt/c/Users/lopesl/.ubuntupro/.address": open /mnt/c/Users/lopesl/.ubuntupro/.address: no such file or directory
2024-12-03T07:46:16.908213-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Reconnecting to Windows host in 60 seconds
2024-12-03T07:46:16.908221-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Not connected: waiting to retry"
2024-12-03T07:47:04.194432-03:00 BRCWBNB0553 PackageKit: daemon quit
2024-12-03T07:47:04.229150-03:00 BRCWBNB0553 systemd[1]: packagekit.service: Deactivated successfully.
2024-12-03T07:47:16.911036-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Daemon: connecting to Windows Agent
2024-12-03T07:47:16.911168-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Connecting"
2024-12-03T07:47:16.912393-03:00 BRCWBNB0553 wsl-pro-service[249]: WARNING Daemon: could not connect to Windows Agent: could not get address: could not read agent port file "/mnt/c/Users/lopesl/.ubuntupro/.address": open /mnt/c/Users/lopesl/.ubuntupro/.address: no such file or directory
2024-12-03T07:47:16.912443-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Reconnecting to Windows host in 60 seconds
2024-12-03T07:47:16.912466-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Not connected: waiting to retry"
2024-12-03T07:48:16.913269-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Daemon: connecting to Windows Agent
2024-12-03T07:48:16.913473-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Connecting"
2024-12-03T07:48:16.916479-03:00 BRCWBNB0553 wsl-pro-service[249]: WARNING Daemon: could not connect to Windows Agent: could not get address: could not read agent port file "/mnt/c/Users/lopesl/.ubuntupro/.address": open /mnt/c/Users/lopesl/.ubuntupro/.address: no such file or directory
2024-12-03T07:48:16.916513-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Reconnecting to Windows host in 60 seconds
2024-12-03T07:48:16.916554-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Not connected: waiting to retry"
2024-12-03T07:49:16.937556-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Daemon: connecting to Windows Agent
2024-12-03T07:49:16.937765-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Connecting"
2024-12-03T07:49:16.939928-03:00 BRCWBNB0553 wsl-pro-service[249]: WARNING Daemon: could not connect to Windows Agent: could not get address: could not read agent port file "/mnt/c/Users/lopesl/.ubuntupro/.address": open /mnt/c/Users/lopesl/.ubuntupro/.address: no such file or directory
2024-12-03T07:49:16.940001-03:00 BRCWBNB0553 wsl-pro-service[249]: INFO Reconnecting to Windows host in 60 seconds
2024-12-03T07:49:16.940020-03:00 BRCWBNB0553 wsl-pro-service[249]: DEBUG Updated systemd status to "Not connected: waiting to retry"