Compare commits
11 commits: b800f0308d...1.1

Commits:
- 7fd69a9c3f
- 0dfef3e611
- 707645ab36
- df55660291
- e8e1d8b60d
- ba3d8dded5
- d7c0fdd091
- f168d0c60a
- 0cbec6477f
- a55365259c
- d529568798

.gitignore (vendored): 4 changes
@@ -45,3 +45,7 @@ searchindex.json
 # Temporary files
 C:\Temp_SP\
 
+
+# Project tooling / AI assistant files
+tasks/
+.claude/
README.md: 49 changes
@@ -6,13 +6,27 @@ En moderne Python-baseret fil-browser til Microsoft SharePoint, specielt designe
 - **Ingen Sti-begrænsning (MAX_PATH):** Problemfri og pålidelig redigering uanset mappedybde i SharePoint.
 - **Sikker Redigering & Kollision-beskyttelse:** Automatisk Check-out/Check-in og intelligent overvågning af fil-låse lokalt.
-- **Professionel Brugerflade:** Bygget med `wxPython` (Native Windows UI) inklusiv indfødte OS-ikoner, brødkrummesti (breadcrumbs) og lazy-loading af hierarkisk træstruktur for markant hurtigere navigation.
+- **Professionel Brugerflade:** Bygget med `wxPython` (Native Windows UI) inklusiv indfødte OS-ikoner, data-drevet brødkrummesti (breadcrumbs) og lazy-loading af hierarkisk træstruktur for markant hurtigere navigation.
+- **Paginering & Stor-tenant understøttelse:** Alle Graph API-kald følger `@odata.nextLink` automatisk, så mapper og sites med hundredvis af elementer indlæses korrekt uden tab.
+- **Statuslinje med fremgangsmåler:** En integreret statusmåler (gauge) i statuslinjen giver visuelt feedback under alle netværksoperationer.
 - **First-Run Setup Wizard:** Automatisk konfigurationsguide ved første opstart, der opsamler Client ID og Tenant ID (kræver ingen forudgående manuel `settings.json`).
 - **Avanceret Søgning:** Hurtig global søgefunktion der bygger på et lokalt, trinvist opdateret indeks, samt understøttelse af "OG"-logik (AND logic).
 - **Fuld Fil- og Mappestyring:** Understøtter upload, sletning og omdøbning, samt visning af udvidet fil-metadata (filstørrelse, redigeringsdato).
 - **Multisprog:** Indbygget og brugerstyret understøttelse af både Dansk og Engelsk-grænseflade.
 - **Multi-File Editing:** Robust understøttelse for lokalt at redigere flere forskellige filer uafhængigt af hinanden i baggrunden uden at interface fryser.
 
+## ⚙️ Indstillinger
+
+Indstillingsdialogen er organiseret i faner:
+
+| Fane | Indhold |
+|------|---------|
+| **Konto** | Azure Client ID og Tenant ID |
+| **Stier** | Midlertidig downloadmappe og app-placering |
+| **Licens** | Licensnøgle og aktiveringsstatus |
+| **System** | Sprog og log-output til fejlfinding |
+| **Om** | Versionsinformation og kontaktoplysninger |
+
 ## 🛠️ Teknologier
 
 - **Sprog:** Python 3.x
@@ -31,7 +45,7 @@ pip install wxPython msal requests
 ```
 
 ### Kør applikationen
-Star op med:
+Start op med:
 ```bash
 python sharepoint_browser.py
 ```
@@ -39,21 +53,34 @@ python sharepoint_browser.py
 Ved første kørsel uden en konfiguration vil applikationen præsentere en Setup Wizard, hvor man nemt kan indtaste Microsoft-loginoplysningerne. Indtastningerne gemmes i en lokal `settings.json` fil.
 
 ## 🏗️ Byg til EXE (Valgfrit)
-For at pakke programmet til en uafhængig, "kør-bar" `.exe` fil til Windows, kan dette gøres med PyInstaller:
+For at pakke programmet til en uafhængig, kørbar `.exe` fil til Windows bruges den medfølgende PyInstaller spec-fil:
 
 ```bash
 pip install pyinstaller
-python -m PyInstaller --windowed --onefile --icon=icon.ico --name "SharePoint Browser" sharepoint_browser.py
+python -m PyInstaller "SharePoint Explorer.spec" --noconfirm
 ```
-*(Husk at afhæningheder og ikoner skal inddrages formelt i dit build afhængigt af din PyInstaller spec-fil).*
+
+Den færdige fil placeres i `dist/SharePoint Explorer.exe` med ikon indlejret.
 
 ## 🧩 Arkitektur & Workflow
-1. **Godkendelse:** Det autentificerer brugeren via MSAL & MS Graph API.
-2. **Navigation:** Data hentes asynkront (lazy-loading). Alt håndteres med ID'er istedet for filstier, hvilket sikrer MAX_PATH-modstandsdygtighed.
-3. **Baggrundshåndtering af redigering:**
+1. **Godkendelse:** Autentificerer brugeren via MSAL & MS Graph API.
+2. **Navigation:** Data hentes asynkront (lazy-loading) og chunked — de første resultater vises straks mens resten streames ind. Alt håndteres med ID'er i stedet for filstier, hvilket sikrer MAX_PATH-modstandsdygtighed.
+3. **Navigationskontekst:** Brødkrummestien er data-drevet; hvert segment gemmer det fulde navigationsobjekt, så klik på en forælder altid navigerer korrekt uden at gennemsøge træet.
+4. **Stale-result beskyttelse:** En navigations-tæller (`_nav_gen`) sikrer, at svar fra afbrudte netværkskald aldrig overskriver den aktive mappevisning.
+5. **Baggrundshåndtering af redigering:**
    - Filer tjekkes ud (`Checkout`) direkte i SharePoint.
-   - Hentes ned til det lokale drev under korte stier under C-drevet eks. `C:\Temp_SP\[MD5-Hash].[ext]`.
+   - Hentes ned til det lokale drev under korte stier, eks. `C:\Temp_SP\[MD5-Hash].[ext]`.
-   - Et baggrunds-thread overvåger derefter det lokale program (fx Word) kontinuerligt via `os.rename()` tricket.
+   - Et baggrunds-thread overvåger det lokale program (fx Word) kontinuerligt via `os.rename()` tricket.
-   - Når filen lukkes fra dit office-program, uploades ændringerne op til SharePoint og modtager et `Checkin`.
+   - Når filen lukkes, uploades ændringerne til SharePoint og modtager et `Checkin`.
 
+## 🧪 Tests
+
+En unit-test suite dækker de kritiske logik-komponenter:
+
+```bash
+python -m unittest tests.test_review_fixes -v
+```
+
+Testene kører uden skærm/display og dækker: navigations-tæller, oversættelses-nøgler, URL-initialisering og signaturer.
+
 ## 💡 Backlog / Kommende muligheder
 1. Integration for håndtering af flere tenants (lejemål)
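The architecture section above describes downloading files to short hashed paths like `C:\Temp_SP\[MD5-Hash].[ext]` to stay clear of MAX_PATH. A minimal sketch of how such a mapping could work; the helper name `short_local_path` is hypothetical, not taken from the repository:

```python
import hashlib
import os

def short_local_path(item_id: str, file_name: str, temp_root: str = r"C:\Temp_SP") -> str:
    """Map a SharePoint item to a short, collision-free local path.

    Hashing the item ID keeps the path well under MAX_PATH no matter how
    deeply the file is nested in SharePoint, while the original extension
    is preserved so the registered editor (Word, Excel, ...) opens it.
    """
    digest = hashlib.md5(item_id.encode("utf-8")).hexdigest()
    _, ext = os.path.splitext(file_name)
    return os.path.join(temp_root, f"{digest}{ext}")
```

The hash hides the original name entirely, so two files with the same name in different folders never collide locally.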
@@ -67,13 +67,36 @@ def load_settings():
     with open(SETTINGS_FILE, 'r', encoding='utf-8') as f:
         try:
             return json.load(f)
-        except:
+        except Exception:
             return default_settings
 
 def save_settings(new_settings):
     with open(SETTINGS_FILE, 'w', encoding='utf-8') as f:
         json.dump(new_settings, f, indent=4)
 
+# --- GRAPH API REQUEST HELPER ---
+_RETRY_STATUSES = {429, 503}
+_MAX_RETRIES = 3
+
+def _graph_request(method, url, **kwargs):
+    """Thin wrapper around requests.request with retry for 429/503.
+
+    Retries up to _MAX_RETRIES times when Graph API signals rate limiting
+    (429) or transient unavailability (503), honouring the Retry-After header
+    when present. A default timeout of 30 s is injected if the caller does
+    not supply one. File-upload calls that pass an open stream as data=
+    should use requests.put() directly, since a stream cannot be re-read.
+    """
+    kwargs.setdefault("timeout", 30)
+    for attempt in range(_MAX_RETRIES):
+        res = requests.request(method, url, **kwargs)
+        if res.status_code not in _RETRY_STATUSES:
+            return res
+        if attempt < _MAX_RETRIES - 1:  # Don't sleep after the final attempt
+            retry_after = int(res.headers.get("Retry-After", 2 ** attempt))
+            time.sleep(min(retry_after, 60))
+    return res  # Return last response after exhausting retries
 
 settings = load_settings()
 CLIENT_ID = settings.get("client_id")
 TENANT_ID = settings.get("tenant_id")
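The retry behaviour of the `_graph_request` helper added above can be exercised without a network by factoring the transport and the sleep function out as parameters. A sketch under the assumption that the loop matches the one in the diff; `request_with_retry` and `FakeResponse` are illustrative names:

```python
import time

RETRY_STATUSES = {429, 503}
MAX_RETRIES = 3

class FakeResponse:
    """Stand-in for requests.Response: just a status code and headers."""
    def __init__(self, status_code, headers=None):
        self.status_code = status_code
        self.headers = headers or {}

def request_with_retry(send, method, url, sleep=time.sleep, **kwargs):
    """Same loop as _graph_request, but with the transport (`send`) and
    the sleep function injectable so the backoff schedule is testable."""
    kwargs.setdefault("timeout", 30)
    res = None
    for attempt in range(MAX_RETRIES):
        res = send(method, url, **kwargs)
        if res.status_code not in RETRY_STATUSES:
            return res
        if attempt < MAX_RETRIES - 1:  # don't sleep after the final attempt
            # Honour Retry-After when the server sends it, else back off
            # exponentially (1 s, 2 s, ...), capped at 60 s.
            retry_after = int(res.headers.get("Retry-After", 2 ** attempt))
            sleep(min(retry_after, 60))
    return res  # last response after exhausting retries
```

With a scripted `send` returning 429, then 503, then 200, the loop sleeps once per failed attempt and returns the final 200.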
@@ -671,9 +694,11 @@ class SharePointApp(wx.Frame):
         self.current_items = []        # Gemmer graf-objekterne for rækkerne
         self.tree_item_data = {}       # Mappenoder -> {type, id, name, drive_id, path}
         self.current_path_data = []    # Gemmer data-objekterne for den nuværende sti (brødkrummer)
+        self._nav_gen = 0              # Incremented on each navigation to discard stale fetch results
         self.tree_root = None
         self.is_navigating_back = False
         self.active_edits = {}         # item_id -> { "name": name, "event": Event, "waiting": bool }
+        self._edits_lock = threading.Lock()
         self.favorites = settings.get("favorites", [])
         self.fav_visible = settings.get("fav_visible", True)
         self.sort_col = 0              # Default (Navn)
@@ -707,7 +732,7 @@ class SharePointApp(wx.Frame):
         try:
             self.msal_app = msal.PublicClientApplication(CLIENT_ID, authority=AUTHORITY)
         except Exception as e:
-            print(f"MSAL Init Error: {e}")
+            logger.error(f"MSAL Init Error: {e}")
 
         self.InitUI()
         self.Centre()
@@ -736,7 +761,7 @@ class SharePointApp(wx.Frame):
         if kwargs:
             try:
                 return text.format(**kwargs)
-            except:
+            except Exception:
                 pass
         return text
 
@@ -1169,7 +1194,7 @@ class SharePointApp(wx.Frame):
             status_text = "Sletter" if self.lang == "da" else "Deleting"
             self.set_status(f"{status_text} {count+1}/{total}: '{item['name']}'...")
             url = f"https://graph.microsoft.com/v1.0/drives/{item['drive_id']}/items/{item['id']}"
-            res = requests.delete(url, headers=self.headers)
+            res = _graph_request("DELETE", url, headers=self.headers, timeout=30)
             if res.status_code in [204, 200]:
                 count += 1
             else:
@@ -1196,7 +1221,7 @@ class SharePointApp(wx.Frame):
         url = f"https://graph.microsoft.com/v1.0/drives/{drive_id}/items/{parent_id}:/(unknown):/content"
         try:
             with open(local_path, 'rb') as f:
-                res = requests.put(url, headers=self.headers, data=f)
+                res = requests.put(url, headers=self.headers, data=f, timeout=120)
             if res.status_code in [200, 201]:
                 self.set_status(self.get_txt("msg_upload_success", name=filename))
                 self._refresh_current_view()
@@ -1237,7 +1262,7 @@ class SharePointApp(wx.Frame):
     def _create_folder_sync(self, name, drive_id, parent_id):
         url = f"https://graph.microsoft.com/v1.0/drives/{drive_id}/items/{parent_id}/children"
         body = {"name": name, "folder": {}, "@microsoft.graph.conflictBehavior": "rename"}
-        res = requests.post(url, headers=self.headers, json=body)
+        res = _graph_request("POST", url, headers=self.headers, json=body, timeout=30)
         if res.status_code in [200, 201]:
             return res.json().get('id')
         return None
@@ -1247,7 +1272,7 @@ class SharePointApp(wx.Frame):
         filename = os.path.basename(local_path)
         url = f"https://graph.microsoft.com/v1.0/drives/{drive_id}/items/{parent_id}:/(unknown):/content"
         with open(local_path, 'rb') as f:
-            requests.put(url, headers=self.headers, data=f)
+            requests.put(url, headers=self.headers, data=f, timeout=120)
 
     def on_new_folder_clicked(self, event):
         if not self.current_drive_id: return
@@ -1281,7 +1306,7 @@ class SharePointApp(wx.Frame):
         self.set_status(f"{self.get_txt('msg_rename')}...")
         url = f"https://graph.microsoft.com/v1.0/drives/{item['drive_id']}/items/{item['id']}"
         body = {"name": new_name}
-        res = requests.patch(url, headers=self.headers, json=body)
+        res = _graph_request("PATCH", url, headers=self.headers, json=body, timeout=30)
         if res.status_code in [200, 201]:
             self.set_status(self.get_txt("msg_success"))
             self._refresh_current_view()
@@ -1302,7 +1327,7 @@ class SharePointApp(wx.Frame):
         with wx.DirDialog(self, self.get_txt("msg_select_folder"), style=wx.DD_DEFAULT_STYLE | wx.DD_DIR_MUST_EXIST) as dd:
             if dd.ShowModal() == wx.ID_OK:
                 parent_path = dd.GetPath()
-                dest_path = os.path.join(parent_path, item['name'])
+                dest_path = os.path.join(parent_path, os.path.basename(item['name']))
                 threading.Thread(target=self._download_folder_bg_task, args=(item, dest_path), daemon=True).start()
 
     def _download_file_bg_task(self, item, dest_path):
@@ -1318,7 +1343,7 @@ class SharePointApp(wx.Frame):
 
     def _download_file_sync_call(self, drive_id, item_id, dest_path):
         url = f"https://graph.microsoft.com/v1.0/drives/{drive_id}/items/{item_id}/content"
-        res = requests.get(url, headers=self.headers)
+        res = _graph_request("GET", url, headers=self.headers, timeout=30)
         if res.status_code == 200:
             with open(dest_path, 'wb') as f:
                 f.write(res.content)
@@ -1337,12 +1362,12 @@ class SharePointApp(wx.Frame):
 
         url = f"https://graph.microsoft.com/v1.0/drives/{drive_id}/items/{folder_id}/children"
         while url:
-            res = requests.get(url, headers=self.headers)
+            res = _graph_request("GET", url, headers=self.headers, timeout=30)
             if res.status_code == 200:
                 res_data = res.json()
                 items = res_data.get('value', [])
                 for item in items:
-                    item_path = os.path.join(local_dir, item['name'])
+                    item_path = os.path.join(local_dir, os.path.basename(item['name']))
                     if 'folder' in item:
                         self._download_folder_recursive_sync(drive_id, item['id'], item_path)
                     else:
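Several download paths in this diff wrap the server-supplied `item['name']` in `os.path.basename` before joining it onto a local directory, so a remote name containing path separators cannot escape the destination folder. The guard can be shown in isolation. A sketch: `safe_join` is an illustrative name, and the backslash normalisation goes slightly beyond the diff's plain `basename` call, since POSIX `basename` does not split on `\`:

```python
import os

def safe_join(dest_dir: str, remote_name: str) -> str:
    """Join a server-supplied file name onto a local directory, stripping
    any path components so a malicious name like '..\\..\\evil.exe' cannot
    escape dest_dir (mirrors the os.path.basename guard in the diff)."""
    # Normalise backslashes first so the guard also holds on POSIX,
    # where os.path.basename only splits on forward slashes.
    name = os.path.basename(remote_name.replace("\\", "/"))
    if name in ("", ".", ".."):
        raise ValueError(f"unusable remote name: {remote_name!r}")
    return os.path.join(dest_dir, name)
```

Rejecting empty and dot names outright is stricter than silently writing into the destination directory itself.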
@@ -1381,7 +1406,8 @@ class SharePointApp(wx.Frame):
         wx.CallAfter(_do)
 
     def on_done_editing_clicked(self, event):
-        waiting_files = [fid for fid, d in self.active_edits.items() if d.get("waiting")]
+        with self._edits_lock:
+            waiting_files = [fid for fid, d in self.active_edits.items() if d.get("waiting")]
         if not waiting_files:
             return
@@ -1643,7 +1669,7 @@ class SharePointApp(wx.Frame):
             btn.Bind(wx.EVT_BUTTON, self.load_sites)
         elif data:
             def on_click(e, d=data):
-                self._navigate_to_item_data(d, is_breadcrumb=True)
+                self._navigate_to_item_data(d)
                 # Efter navigation, prøv at finde og vælge den i træet
                 wx.CallAfter(self._sync_tree_selection_by_path, d["path"])
 
@@ -1665,7 +1691,7 @@ class SharePointApp(wx.Frame):
             self.headers = {'Authorization': f'Bearer {self.access_token}'}
             return True
         except Exception as e:
-            print(f"Token refresh error: {e}")
+            logger.error(f"Token refresh error: {e}")
 
         self.set_status(self.get_txt("status_login_needed"))
         return False
@@ -1716,13 +1742,12 @@ class SharePointApp(wx.Frame):
         url = "https://graph.microsoft.com/v1.0/sites?search=*"
 
         while url:
-            res = requests.get(url, headers=self.headers)
+            res = _graph_request("GET", url, headers=self.headers, timeout=30)
             if res.status_code == 200:
                 data = res.json()
                 all_sites.extend(data.get('value', []))
                 url = data.get('@odata.nextLink')
                 self.set_status(f"{self.get_txt('status_fetching_sites')} ({len(all_sites)}...)")
-                self.pulse_gauge(True)
             else:
                 break
 
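The `@odata.nextLink` loop above recurs in several fetch methods of this diff. The pattern itself can be factored into a generator; a sketch with the authenticated GET injected as a callable, where `iter_graph_pages` and `FakePage` are hypothetical helpers, not part of the diff:

```python
def iter_graph_pages(fetch, url):
    """Yield every item from a paged Graph collection, following
    @odata.nextLink until the server stops returning one."""
    while url:
        res = fetch(url)
        if res.status_code != 200:
            break
        data = res.json()
        yield from data.get("value", [])
        url = data.get("@odata.nextLink")

class FakePage:
    """Stand-in for a requests.Response carrying one page of results."""
    def __init__(self, payload):
        self.status_code = 200
        self._payload = payload
    def json(self):
        return self._payload
```

In the application the callable would be something along the lines of `lambda u: _graph_request("GET", u, headers=self.headers)`.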
@@ -1783,12 +1808,11 @@ class SharePointApp(wx.Frame):
         url = f"https://graph.microsoft.com/v1.0/drives/{data['drive_id']}/items/{data['id']}/children"
 
         while url:
-            res = requests.get(url, headers=self.headers)
+            res = _graph_request("GET", url, headers=self.headers, timeout=30)
             if res.status_code == 200:
                 res_data = res.json()
                 all_children.extend(res_data.get('value', []))
                 url = res_data.get('@odata.nextLink')
-                self.pulse_gauge(True)
             else:
                 break
 
@@ -1854,10 +1878,10 @@ class SharePointApp(wx.Frame):
 
         self._navigate_to_item_data(data, tree_item=item)
 
-    def _navigate_to_item_data(self, data, tree_item=None, is_breadcrumb=False):
+    def _navigate_to_item_data(self, data, tree_item=None):
         try:
-            # Race-condition beskyttelse: Hvis vi allerede er der, så stop (undtagen ved brødkrumme-klik)
-            if not is_breadcrumb and getattr(self, 'current_path', None) == data.get("path"):
+            # Race-condition beskyttelse: Hvis vi allerede er der, så stop
+            if getattr(self, 'current_path', None) == data.get("path"):
                 return
 
             self.current_path = data["path"]
@@ -1887,11 +1911,13 @@ class SharePointApp(wx.Frame):
             self.current_items = []
             self.set_status(self.get_txt("status_loading_content"))
 
-            threading.Thread(target=self._fetch_list_contents_bg, args=(data,), daemon=True).start()
+            self._nav_gen += 1
+            gen = self._nav_gen
+            threading.Thread(target=self._fetch_list_contents_bg, args=(data, gen), daemon=True).start()
         except RuntimeError:
             pass
 
-    def _fetch_list_contents_bg(self, data):
+    def _fetch_list_contents_bg(self, data, nav_gen=None):
         if not self.ensure_valid_token(): return
         self.pulse_gauge(True)
         items_data = []
@@ -1905,7 +1931,7 @@ class SharePointApp(wx.Frame):
 
         first_chunk = True
         while url:
-            res = requests.get(url, headers=self.headers)
+            res = _graph_request("GET", url, headers=self.headers, timeout=30)
            if res.status_code != 200: break
 
            res_data = res.json()
@@ -1938,7 +1964,6 @@ class SharePointApp(wx.Frame):
 
             items_data.extend(chunk_data)
             self.set_status(self.get_txt("status_loading_items").format(count=len(items_data)))
-            self.pulse_gauge(True)
 
             # Chunked UI Update
             if first_chunk:
@@ -1950,7 +1975,7 @@ class SharePointApp(wx.Frame):
             url = res_data.get('@odata.nextLink')
 
         # Finalize
-        wx.CallAfter(self._finalize_list_loading, items_data)
+        wx.CallAfter(self._finalize_list_loading, items_data, nav_gen)
         self.pulse_gauge(False)
 
     def _append_list_items(self, items):
@@ -1973,8 +1998,10 @@ class SharePointApp(wx.Frame):
             self.list_ctrl.SetItem(idx, 2, size_str)
             self.list_ctrl.SetItem(idx, 3, item['modified'])
 
-    def _finalize_list_loading(self, items_data):
+    def _finalize_list_loading(self, items_data, nav_gen=None):
         if not self: return
+        if nav_gen is not None and nav_gen != self._nav_gen:
+            return  # User navigated away; discard stale results
         self.current_items = items_data
         self.apply_sorting()
         self.set_status(self.get_txt("status_ready"))
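The `_nav_gen` check added to `_finalize_list_loading` above is the classic generation-counter guard against stale asynchronous results. Stripped of the wx machinery, the idea looks like this; class and method names are illustrative, not from the repository:

```python
class NavigationGuard:
    """Minimal sketch of the _nav_gen pattern: each navigation bumps a
    generation counter, and a fetch result is applied only if its token
    still matches, so a slow response can never overwrite a newer view."""
    def __init__(self):
        self.gen = 0
        self.current = None

    def start_navigation(self):
        self.gen += 1
        return self.gen          # token handed to the background fetch

    def apply_result(self, token, items):
        if token != self.gen:    # user navigated away meanwhile
            return False
        self.current = items
        return True
```

Because the counter only ever increases, every superseded fetch compares unequal and is dropped, with no need to cancel the underlying network call.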
@@ -2071,7 +2098,7 @@ class SharePointApp(wx.Frame):
                 idx = self.image_list.Add(bmp)
                 self.ext_icons[ext] = idx
                 return idx
-        except:
+        except Exception:
             pass
 
         self.ext_icons[ext] = self.idx_file
@@ -2230,11 +2257,15 @@ class SharePointApp(wx.Frame):
         file_name = item['name']
         drive_id = item['drive_id']
 
-        if item_id in self.active_edits:
+        with self._edits_lock:
+            already_editing = item_id in self.active_edits
+            at_limit = len(self.active_edits) >= 10
+
+        # UI dialogs are called outside the lock to avoid holding it during blocking calls
+        if already_editing:
             self.show_info(f"'{file_name}' er allerede ved at blive redigeret.", wx.ICON_INFORMATION)
             return
-
-        if len(self.active_edits) >= 10:
+        if at_limit:
             wx.MessageBox("Du kan kun have 10 filer åbne til redigering ad gangen.", "Maksimum grænse nået", wx.OK | wx.ICON_WARNING)
             return
 
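The hunk above reads `already_editing` and `at_limit` under `_edits_lock` and shows dialogs only after releasing it. One variation worth noting: because the actual insertion into `active_edits` happens in a later critical section, two rapid requests could in principle both pass the check. Folding the check and the reservation into a single critical section closes that check-then-act gap. A sketch of that variant, with illustrative names:

```python
import threading

class EditRegistry:
    """Sketch of the locking discipline in the diff, tightened so the
    membership check, the limit check and the reservation happen in one
    critical section; blocking work (dialogs, network) stays outside."""
    def __init__(self, limit=10):
        self._lock = threading.Lock()
        self._edits = {}
        self._limit = limit

    def try_begin(self, item_id, name):
        """Atomically decide whether a new edit may start."""
        with self._lock:
            if item_id in self._edits:
                return "already_editing"
            if len(self._edits) >= self._limit:
                return "at_limit"
            self._edits[item_id] = {"name": name, "waiting": False}
            return "started"

    def finish(self, item_id):
        with self._lock:
            self._edits.pop(item_id, None)
```

The caller branches on the returned status string and shows the corresponding dialog without ever holding the lock.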
@@ -2271,7 +2302,8 @@ class SharePointApp(wx.Frame):
         if not self.ensure_valid_token(): return
 
         edit_event = threading.Event()
-        self.active_edits[item_id] = {"name": file_name, "event": edit_event, "waiting": False}
+        with self._edits_lock:
+            self.active_edits[item_id] = {"name": file_name, "event": edit_event, "waiting": False}
         self.update_edit_ui()
 
         try:
@@ -2288,7 +2320,7 @@ class SharePointApp(wx.Frame):
 
             # 2. Download
             self.set_status(self.get_txt("msg_fetching_file", name=file_name))
-            res = requests.get(f"{base_url}/content", headers=self.headers)
+            res = _graph_request("GET", f"{base_url}/content", headers=self.headers, timeout=30)
             if res.status_code != 200:
                 raise Exception(f"{self.get_txt('msg_unknown_error')}: {res.status_code}")
@@ -2325,7 +2357,7 @@ class SharePointApp(wx.Frame):
 
             # Checkout
             is_checked_out = False
-            checkout_res = requests.post(f"{base_url}/checkout", headers=self.headers)
+            checkout_res = _graph_request("POST", f"{base_url}/checkout", headers=self.headers, timeout=30)
             if checkout_res.status_code in [200, 201, 204]:
                 is_checked_out = True
                 logger.info(f"Fil {file_name} udtjekket succesfuldt.")
@@ -2358,13 +2390,15 @@ class SharePointApp(wx.Frame):
             else:
                 self.set_status(self.get_txt("msg_waiting_for_file", name=file_name))
                 edit_event.clear()
-                self.active_edits[item_id]["waiting"] = True
+                with self._edits_lock:
+                    self.active_edits[item_id]["waiting"] = True
                 self.update_edit_ui()
 
                 edit_event.wait()
 
-                if item_id in self.active_edits:
-                    self.active_edits[item_id]["waiting"] = False
+                with self._edits_lock:
+                    if item_id in self.active_edits:
+                        self.active_edits[item_id]["waiting"] = False
                 self.update_edit_ui()
 
             # 4. Tjek om noget er ændret
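The waiting sequence above parks the background editor thread on a `threading.Event` until the user clicks "Done editing". A minimal reproduction of that hand-off; the second event exists only so the demo can tell that the worker has actually reached its wait before signalling:

```python
import threading

parked = threading.Event()   # worker signals: I am about to wait
done = threading.Event()     # UI signals: user clicked "Done editing"
results = []

def editor_worker():
    parked.set()
    done.wait()                  # corresponds to edit_event.wait() in the diff
    results.append("uploaded")   # upload + check-in would happen here

t = threading.Thread(target=editor_worker, daemon=True)
t.start()
parked.wait(timeout=5)           # worker is now blocked, as during editing
done.set()                       # "Done editing" clicked
t.join(timeout=5)
```

Setting the Event wakes every waiter at once, which is why the diff clears it again (`edit_event.clear()`) before each new wait cycle.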
@@ -2385,7 +2419,7 @@ class SharePointApp(wx.Frame):
 
             if is_checked_out:
                 logger.info(f"Annullerer udtjekning (discardCheckout) for {file_name}...")
-                res = requests.post(f"{base_url}/discardCheckout", headers=self.headers)
+                res = _graph_request("POST", f"{base_url}/discardCheckout", headers=self.headers, timeout=30)
                 if res.status_code in [200, 204]:
                     is_checked_out = False
             else:
@@ -2393,14 +2427,14 @@ class SharePointApp(wx.Frame):
                 logger.info(f"Ændring fundet! Uploader {file_name}...")
                 self.set_status(self.get_txt("msg_updating_changes"))
                 with open(local_path, 'rb') as f:
-                    upload_res = requests.put(f"{base_url}/content", headers=self.headers, data=f)
+                    upload_res = requests.put(f"{base_url}/content", headers=self.headers, data=f, timeout=120)
                 if upload_res.status_code not in [200, 201]:
                     raise Exception(f"{self.get_txt('msg_update_failed_code', code=upload_res.status_code)}")
 
                 # 6. Checkin (Kun hvis vi faktisk uploadede noget)
                 if is_checked_out:
                     self.set_status(self.get_txt("msg_checking_in", name=file_name))
-                    res = requests.post(f"{base_url}/checkin", headers=self.headers, json={"comment": "SP Explorer Edit"})
+                    res = _graph_request("POST", f"{base_url}/checkin", headers=self.headers, json={"comment": "SP Explorer Edit"}, timeout=30)
                     if res.status_code in [200, 201, 204]:
                         is_checked_out = False
@@ -2408,7 +2442,7 @@ class SharePointApp(wx.Frame):
|
|||||||
try:
|
try:
|
||||||
os.remove(local_path)
|
os.remove(local_path)
|
||||||
os.rmdir(working_dir)
|
os.rmdir(working_dir)
|
||||||
except:
|
except Exception:
|
||||||
pass
|
pass
|
||||||
|
|
||||||
self.set_status(self.get_txt("msg_update_success", name=file_name))
|
self.set_status(self.get_txt("msg_update_success", name=file_name))
|
||||||
@@ -2422,10 +2456,11 @@ class SharePointApp(wx.Frame):
|
|||||||
if is_checked_out:
|
if is_checked_out:
|
||||||
# Emergency cleanup hvis vi stadig har fat i filen (f.eks. ved crash eller afbrydelse)
|
# Emergency cleanup hvis vi stadig har fat i filen (f.eks. ved crash eller afbrydelse)
|
||||||
logger.info(f"Rydder op: Kalder discardCheckout for {file_name}...")
|
logger.info(f"Rydder op: Kalder discardCheckout for {file_name}...")
|
||||||
requests.post(f"{base_url}/discardCheckout", headers=self.headers)
|
_graph_request("POST", f"{base_url}/discardCheckout", headers=self.headers, timeout=30)
|
||||||
|
|
||||||
if item_id in self.active_edits:
|
with self._edits_lock:
|
||||||
del self.active_edits[item_id]
|
if item_id in self.active_edits:
|
||||||
|
del self.active_edits[item_id]
|
||||||
self.update_edit_ui()
|
self.update_edit_ui()
|
||||||
|
|
||||||
if __name__ == "__main__":
|
if __name__ == "__main__":
|
||||||
|
tests/test_review_fixes.py (new file, 501 lines)
@@ -0,0 +1,501 @@
"""
Unit tests for the code-review bug fixes applied to sharepoint_browser.py.

Covers:
    C1  - url=None prevents UnboundLocalError in _fetch_tree_children_bg
    I1  - nav_gen guard in _finalize_list_loading discards stale results
    I2  - System tab label derivation from STRINGS key
    I3  - status_loading_items translation key present and formattable
    C-1 - Refresh calls (nav_gen=None default) always apply their results
    S-2 - nav_gen=None sentinel is safer than 0
    S-1 - is_breadcrumb parameter removed from _navigate_to_item_data
    S2  - Dead SITE branch removed from _append_list_items

All tests run without a live display; wx is imported but no widgets are
instantiated.
"""

import sys
import os
import inspect
import unittest
from unittest.mock import MagicMock, patch

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import sharepoint_browser as sb


# ---------------------------------------------------------------------------
# I2 + I3: STRINGS dictionary
# ---------------------------------------------------------------------------

class TestStrings(unittest.TestCase):

    def test_system_tab_label_da(self):
        """I2: Danish settings_logging_group yields 'System' after split/strip."""
        group = sb.STRINGS["da"]["settings_logging_group"]
        self.assertEqual(group.split("/")[0].strip(), "System")

    def test_system_tab_label_en(self):
        """I2: English settings_logging_group yields 'System' after split/strip."""
        group = sb.STRINGS["en"]["settings_logging_group"]
        self.assertEqual(group.split("/")[0].strip(), "System")

    def test_status_loading_items_present_da(self):
        """I3: status_loading_items key exists in Danish STRINGS."""
        self.assertIn("status_loading_items", sb.STRINGS["da"])

    def test_status_loading_items_present_en(self):
        """I3: status_loading_items key exists in English STRINGS."""
        self.assertIn("status_loading_items", sb.STRINGS["en"])

    def test_status_loading_items_format_da(self):
        """I3: Danish template formats with named {count} argument."""
        result = sb.STRINGS["da"]["status_loading_items"].format(count=42)
        self.assertIn("42", result)

    def test_status_loading_items_format_en(self):
        """I3: English template formats with named {count} argument."""
        result = sb.STRINGS["en"]["status_loading_items"].format(count=99)
        self.assertIn("99", result)

    def test_status_loading_items_da_uses_count_kwarg(self):
        """I3: Danish template uses {count} placeholder (not positional)."""
        template = sb.STRINGS["da"]["status_loading_items"]
        self.assertIn("{count}", template)

    def test_status_loading_items_en_uses_count_kwarg(self):
        """I3: English template uses {count} placeholder (not positional)."""
        template = sb.STRINGS["en"]["status_loading_items"]
        self.assertIn("{count}", template)
# ---------------------------------------------------------------------------
# I1 + C-1 + S-2: nav_gen guard in _finalize_list_loading
# ---------------------------------------------------------------------------

class TestNavGenGuard(unittest.TestCase):
    """
    _finalize_list_loading(self, items_data, nav_gen=None)

    Guard logic:
        nav_gen is None          → always apply (refresh / unconstrained calls)
        nav_gen == self._nav_gen → apply (matches current navigation)
        nav_gen != self._nav_gen → discard (stale; user navigated away)
    """

    def _make_app(self, current_gen: int):
        """Minimal mock that satisfies _finalize_list_loading's needs."""
        app = MagicMock()
        app._nav_gen = current_gen
        # MagicMock is truthy by default so `if not self` passes
        return app

    # --- nav_gen=None cases (C-1 / S-2) ---

    def test_none_gen_applies_when_nav_gen_is_1(self):
        """C-1/S-2: nav_gen=None applies results regardless of _nav_gen."""
        app = self._make_app(1)
        items = [{"name": "a"}]
        sb.SharePointApp._finalize_list_loading(app, items, nav_gen=None)
        self.assertEqual(app.current_items, items)
        app.apply_sorting.assert_called_once()

    def test_none_gen_applies_when_nav_gen_is_high(self):
        """C-1: nav_gen=None still applies when _nav_gen is large."""
        app = self._make_app(99)
        items = [{"name": "b"}]
        sb.SharePointApp._finalize_list_loading(app, items, nav_gen=None)
        self.assertEqual(app.current_items, items)

    def test_default_gen_arg_applies(self):
        """C-1: Omitting nav_gen entirely (default=None) always applies."""
        app = self._make_app(5)
        items = [{"name": "refresh"}]
        sb.SharePointApp._finalize_list_loading(app, items)  # no nav_gen kwarg
        self.assertEqual(app.current_items, items)
        app.apply_sorting.assert_called_once()

    # --- matching gen (I1, happy path) ---

    def test_matching_gen_applies(self):
        """I1: Results applied when nav_gen matches _nav_gen."""
        app = self._make_app(3)
        items = [{"name": "c"}]
        sb.SharePointApp._finalize_list_loading(app, items, nav_gen=3)
        self.assertEqual(app.current_items, items)
        app.apply_sorting.assert_called_once()

    # --- stale gen cases (I1) ---

    def test_stale_gen_discards_results(self):
        """I1: Results discarded when nav_gen < _nav_gen (user navigated away)."""
        app = self._make_app(5)
        sentinel = object()
        app.current_items = sentinel
        sb.SharePointApp._finalize_list_loading(app, [{"name": "old"}], nav_gen=2)
        self.assertIs(app.current_items, sentinel,
                      "current_items was overwritten by stale result")
        app.apply_sorting.assert_not_called()

    def test_future_gen_discards_results(self):
        """I1: Results discarded when nav_gen > _nav_gen (shouldn't happen, but safe)."""
        app = self._make_app(3)
        sentinel = object()
        app.current_items = sentinel
        sb.SharePointApp._finalize_list_loading(app, [{"name": "future"}], nav_gen=99)
        self.assertIs(app.current_items, sentinel)
        app.apply_sorting.assert_not_called()

    def test_gen_zero_still_discarded_when_nav_gen_nonzero(self):
        """S-2: nav_gen=0 (old broken default) is now treated as a stale gen, not a pass."""
        app = self._make_app(1)
        sentinel = object()
        app.current_items = sentinel
        sb.SharePointApp._finalize_list_loading(app, [{"name": "zero"}], nav_gen=0)
        self.assertIs(app.current_items, sentinel,
                      "nav_gen=0 should be treated as stale, not as 'no constraint'")
        app.apply_sorting.assert_not_called()
# ---------------------------------------------------------------------------
# C1: url=None initialization in _fetch_tree_children_bg
# ---------------------------------------------------------------------------

class TestUrlInitialization(unittest.TestCase):

    def test_url_initialized_to_none_before_conditional(self):
        """C1: _fetch_tree_children_bg initializes url=None before the if block."""
        source = inspect.getsource(sb.SharePointApp._fetch_tree_children_bg)
        # Find the line that sets url = None before the if data['type'] conditional
        lines = source.splitlines()
        url_none_idx = next(
            (i for i, l in enumerate(lines) if "url = None" in l), None
        )
        site_if_idx = next(
            (i for i, l in enumerate(lines)
             if 'data[\'type\'] == "SITE"' in l or 'data["type"] == "SITE"' in l),
            None
        )
        self.assertIsNotNone(url_none_idx,
                             "_fetch_tree_children_bg has no 'url = None' initializer")
        self.assertIsNotNone(site_if_idx,
                             "_fetch_tree_children_bg has no SITE type conditional")
        self.assertLess(url_none_idx, site_if_idx,
                        "'url = None' must appear before the if data['type'] block")


# ---------------------------------------------------------------------------
# S2: Dead SITE branch removed from _append_list_items
# ---------------------------------------------------------------------------

class TestDeadSiteBranch(unittest.TestCase):

    def test_append_list_items_has_no_site_img_branch(self):
        """S2: _append_list_items no longer contains the dead SITE image branch."""
        source = inspect.getsource(sb.SharePointApp._append_list_items)
        self.assertNotIn(
            'item[\'type\'] == "SITE"',
            source,
            "_append_list_items still contains dead SITE branch"
        )


# ---------------------------------------------------------------------------
# S-1: is_breadcrumb parameter removed from _navigate_to_item_data
# ---------------------------------------------------------------------------

class TestIsBreadcrumbRemoved(unittest.TestCase):

    def test_no_is_breadcrumb_param(self):
        """S-1: _navigate_to_item_data no longer has an is_breadcrumb parameter."""
        sig = inspect.signature(sb.SharePointApp._navigate_to_item_data)
        self.assertNotIn(
            "is_breadcrumb", sig.parameters,
            "_navigate_to_item_data still declares is_breadcrumb"
        )

    def test_tree_item_param_still_present(self):
        """Regression: tree_item parameter was not accidentally removed."""
        sig = inspect.signature(sb.SharePointApp._navigate_to_item_data)
        self.assertIn("tree_item", sig.parameters)
# ---------------------------------------------------------------------------
# Task 7: bare except → except Exception, print → logger, basename sanitization
# ---------------------------------------------------------------------------

class TestTask7BareExcept(unittest.TestCase):
    """Verify that bare except: clauses have been replaced with except Exception:."""

    def _get_source_lines(self):
        return inspect.getsource(sb).splitlines()

    def test_no_bare_except_in_load_settings(self):
        """7a: load_settings() uses except Exception, not bare except."""
        source = inspect.getsource(sb.load_settings)
        self.assertNotIn("except:", source,
                         "load_settings still has a bare except:")

    def test_no_bare_except_in_get_txt(self):
        """7a: get_txt() uses except Exception, not bare except."""
        source = inspect.getsource(sb.SharePointApp.get_txt)
        self.assertNotIn("except:", source,
                         "get_txt still has a bare except:")

    def test_no_bare_except_in_get_icon_idx(self):
        """7a: get_icon_idx_for_file() uses except Exception, not bare except."""
        source = inspect.getsource(sb.SharePointApp.get_icon_idx_for_file)
        self.assertNotIn("except:", source,
                         "get_icon_idx_for_file still has a bare except:")

    def test_no_bare_except_in_process_file(self):
        """7a: process_file() cleanup block uses except Exception, not bare except."""
        source = inspect.getsource(sb.SharePointApp.process_file)
        self.assertNotIn("except:", source,
                         "process_file still has a bare except: in cleanup block")

    def test_no_print_in_msal_init(self):
        """7b: MSAL init error uses logger, not print()."""
        source = inspect.getsource(sb.SharePointApp.__init__)
        self.assertNotIn('print(f"MSAL Init Error', source,
                         "__init__ still uses print() for MSAL Init Error")

    def test_no_print_in_ensure_valid_token(self):
        """7b: ensure_valid_token() uses logger, not print()."""
        source = inspect.getsource(sb.SharePointApp.ensure_valid_token)
        self.assertNotIn('print(f"Token refresh error', source,
                         "ensure_valid_token still uses print() for token error")

    def test_basename_in_download_folder_recursive(self):
        """7c: _download_folder_recursive_sync uses os.path.basename on item name."""
        source = inspect.getsource(sb.SharePointApp._download_folder_recursive_sync)
        self.assertIn("os.path.basename", source,
                      "_download_folder_recursive_sync does not sanitize item['name'] with basename")

    def test_basename_in_on_download_clicked(self):
        """7c: on_download_clicked uses os.path.basename when building dest_path for folders."""
        source = inspect.getsource(sb.SharePointApp.on_download_clicked)
        self.assertIn("os.path.basename", source,
                      "on_download_clicked does not sanitize item['name'] with basename for folder downloads")
# ---------------------------------------------------------------------------
# Task 8: all requests.* calls must carry a timeout= parameter
# ---------------------------------------------------------------------------

class TestNetworkTimeouts(unittest.TestCase):
    """
    Every requests.get/post/put/patch/delete call must include timeout=
    so that background threads cannot hang indefinitely on a stalled network.
    """

    def _calls_missing_timeout(self):
        """
        Return a list of (line_number, line_text) for every requests.*()
        call in the source file that is missing a timeout= argument.

        Strategy: single-line calls are the norm in this codebase. We look
        for lines that contain 'requests.METHOD(' and do NOT also contain
        'timeout='. Lines that contain 'timeout=' anywhere on the same line
        are considered compliant.
        """
        import re
        src_path = os.path.join(
            os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
            "sharepoint_browser.py"
        )
        # Matches real call sites: assignment or standalone call (not docstrings)
        pattern = re.compile(
            r'(=\s*|^\s*)requests\.(get|post|put|patch|delete)\('
        )
        missing = []
        with open(src_path, encoding='utf-8') as fh:
            for lineno, line in enumerate(fh, 1):
                stripped = line.lstrip()
                # Skip comment lines and docstring prose (lines that are plain text)
                if stripped.startswith('#'):
                    continue
                if pattern.search(line) and 'timeout=' not in line:
                    missing.append((lineno, line.rstrip()))
        return missing

    def test_all_requests_calls_have_timeout(self):
        """Task 8: No requests.* call is missing a timeout= parameter."""
        missing = self._calls_missing_timeout()
        if missing:
            details = "\n".join(f"  line {n}: {txt}" for n, txt in missing)
            self.fail(
                f"{len(missing)} requests call(s) are missing timeout=:\n{details}"
            )
# ---------------------------------------------------------------------------
# Task 9: _graph_request helper — retry on 429/503 with Retry-After support
# ---------------------------------------------------------------------------

class TestGraphRequest(unittest.TestCase):
    """Unit tests for the _graph_request() module-level helper."""

    def test_helper_exists(self):
        """Task 9: _graph_request function is defined at module level."""
        self.assertTrue(
            callable(getattr(sb, "_graph_request", None)),
            "_graph_request is not defined in sharepoint_browser"
        )

    def test_success_on_first_attempt(self):
        """Task 9: Returns immediately when first response is 200."""
        mock_resp = MagicMock()
        mock_resp.status_code = 200
        with patch("sharepoint_browser.requests.request", return_value=mock_resp) as mock_req:
            result = sb._graph_request("GET", "https://example.com/", headers={})
        self.assertEqual(result.status_code, 200)
        self.assertEqual(mock_req.call_count, 1)

    def test_retries_on_429(self):
        """Task 9: Retries when response is 429 (rate limited)."""
        responses = [
            MagicMock(status_code=429, headers={"Retry-After": "0"}),
            MagicMock(status_code=429, headers={"Retry-After": "0"}),
            MagicMock(status_code=200, headers={}),
        ]
        with patch("sharepoint_browser.requests.request", side_effect=responses) as mock_req:
            with patch("sharepoint_browser.time.sleep"):
                result = sb._graph_request("GET", "https://example.com/", headers={})
        self.assertEqual(result.status_code, 200)
        self.assertEqual(mock_req.call_count, 3)

    def test_retries_on_503(self):
        """Task 9: Retries when response is 503 (service unavailable)."""
        responses = [
            MagicMock(status_code=503, headers={}),
            MagicMock(status_code=200, headers={}),
        ]
        with patch("sharepoint_browser.requests.request", side_effect=responses) as mock_req:
            with patch("sharepoint_browser.time.sleep"):
                result = sb._graph_request("POST", "https://example.com/", headers={})
        self.assertEqual(result.status_code, 200)
        self.assertEqual(mock_req.call_count, 2)

    def test_returns_last_response_after_max_retries(self):
        """Task 9: Returns last 429 response when all retries are exhausted."""
        resp_429 = MagicMock(status_code=429, headers={"Retry-After": "0"})
        responses = [resp_429] * sb._MAX_RETRIES
        with patch("sharepoint_browser.requests.request", side_effect=responses):
            with patch("sharepoint_browser.time.sleep"):
                result = sb._graph_request("GET", "https://example.com/", headers={})
        self.assertEqual(result.status_code, 429)

    def test_respects_retry_after_header(self):
        """Task 9: sleep() is called with the Retry-After value from the response."""
        responses = [
            MagicMock(status_code=429, headers={"Retry-After": "5"}),
            MagicMock(status_code=200, headers={}),
        ]
        with patch("sharepoint_browser.requests.request", side_effect=responses):
            with patch("sharepoint_browser.time.sleep") as mock_sleep:
                sb._graph_request("GET", "https://example.com/", headers={})
        mock_sleep.assert_called_once_with(5)

    def test_default_timeout_injected(self):
        """Task 9: timeout=30 is injected when caller does not provide one."""
        mock_resp = MagicMock(status_code=200, headers={})
        with patch("sharepoint_browser.requests.request", return_value=mock_resp) as mock_req:
            sb._graph_request("GET", "https://example.com/", headers={})
        _, kwargs = mock_req.call_args
        self.assertEqual(kwargs.get("timeout"), 30)

    def test_caller_timeout_not_overridden(self):
        """Task 9: Explicit timeout from caller is not overwritten by the helper."""
        mock_resp = MagicMock(status_code=200, headers={})
        with patch("sharepoint_browser.requests.request", return_value=mock_resp) as mock_req:
            sb._graph_request("GET", "https://example.com/", headers={}, timeout=60)
        _, kwargs = mock_req.call_args
        self.assertEqual(kwargs.get("timeout"), 60)

    def test_max_retries_constant_exists(self):
        """Task 9: _MAX_RETRIES constant is defined."""
        self.assertTrue(
            hasattr(sb, "_MAX_RETRIES"),
            "_MAX_RETRIES constant not found in sharepoint_browser"
        )

    def test_no_sleep_after_final_retry(self):
        """Fix #1: sleep() is NOT called after the last exhausted attempt."""
        resp_429 = MagicMock(status_code=429, headers={"Retry-After": "0"})
        responses = [resp_429] * sb._MAX_RETRIES
        with patch("sharepoint_browser.requests.request", side_effect=responses):
            with patch("sharepoint_browser.time.sleep") as mock_sleep:
                sb._graph_request("GET", "https://example.com/", headers={})
        # sleep should be called for the first N-1 failures, NOT the last one
        self.assertEqual(
            mock_sleep.call_count,
            sb._MAX_RETRIES - 1,
            f"sleep() called {mock_sleep.call_count} times; expected {sb._MAX_RETRIES - 1} "
            f"(no sleep after the final failed attempt)"
        )
# ---------------------------------------------------------------------------
# Task 10: threading.Lock for active_edits compound operations
# ---------------------------------------------------------------------------

class TestActiveEditsLock(unittest.TestCase):
    """Verify that _edits_lock exists and guards all compound active_edits operations."""

    def test_edits_lock_declared_in_init(self):
        """Task 10: SharePointApp.__init__ creates self._edits_lock."""
        source = inspect.getsource(sb.SharePointApp.__init__)
        self.assertIn("_edits_lock", source,
                      "__init__ does not declare _edits_lock")

    def test_edits_lock_is_threading_lock(self):
        """Task 10: _edits_lock initialisation uses threading.Lock()."""
        source = inspect.getsource(sb.SharePointApp.__init__)
        self.assertIn("threading.Lock()", source,
                      "__init__ does not initialise _edits_lock with threading.Lock()")

    def test_open_file_uses_lock(self):
        """Task 10: open_file() acquires _edits_lock before checking active_edits."""
        source = inspect.getsource(sb.SharePointApp.open_file)
        self.assertIn("_edits_lock", source,
                      "open_file does not use _edits_lock")

    def test_process_file_uses_lock(self):
        """Task 10: process_file() acquires _edits_lock when writing active_edits."""
        source = inspect.getsource(sb.SharePointApp.process_file)
        self.assertIn("_edits_lock", source,
                      "process_file does not use _edits_lock")

    def test_process_file_lock_on_initial_assign(self):
        """Task 10: active_edits[item_id] = ... assignment is inside a lock block."""
        source = inspect.getsource(sb.SharePointApp.process_file)
        # Check lock wraps the initial dict assignment
        lock_idx = source.find("_edits_lock")
        assign_idx = source.find('self.active_edits[item_id] = ')
        self.assertGreater(assign_idx, 0,
                           "active_edits assignment not found in process_file")
        # The lock must appear before the assignment
        self.assertLess(lock_idx, assign_idx,
                        "_edits_lock must appear before the active_edits assignment in process_file")

    def test_process_file_lock_on_delete(self):
        """Task 10: del active_edits[item_id] is inside a lock block in process_file."""
        source = inspect.getsource(sb.SharePointApp.process_file)
        self.assertIn("del self.active_edits[item_id]", source,
                      "del active_edits[item_id] not found in process_file")
        # Count lock usages — there should be at least 2
        lock_count = source.count("_edits_lock")
        self.assertGreaterEqual(lock_count, 2,
                                f"Expected at least 2 uses of _edits_lock in process_file, found {lock_count}")

    def test_on_done_editing_uses_lock(self):
        """Task 10: on_done_editing_clicked acquires lock for active_edits iteration."""
        source = inspect.getsource(sb.SharePointApp.on_done_editing_clicked)
        self.assertIn("_edits_lock", source,
                      "on_done_editing_clicked does not use _edits_lock")


if __name__ == "__main__":
    unittest.main(verbosity=2)
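The stale-result guard that `TestNavGenGuard` pins down is a general pattern: each navigation bumps a generation counter, background fetches snapshot it, and results are applied only if the snapshot still matches. A standalone sketch, using a hypothetical `ListLoader` simplified from the `_finalize_list_loading` contract described in the tests (not the shipped class):

```python
class ListLoader:
    """Minimal stand-in for SharePointApp's navigation-generation guard."""

    def __init__(self):
        self._nav_gen = 0      # bumped on every navigation
        self.current_items = []

    def navigate(self):
        """Each navigation invalidates results from earlier background fetches."""
        self._nav_gen += 1
        return self._nav_gen   # snapshot handed to the background worker

    def finalize_list_loading(self, items_data, nav_gen=None):
        # nav_gen=None: unconstrained call (e.g. refresh) -> always apply.
        # Otherwise apply only if the snapshot still matches the current gen.
        if nav_gen is not None and nav_gen != self._nav_gen:
            return False       # stale result: user navigated away, discard
        self.current_items = items_data
        return True


loader = ListLoader()
gen = loader.navigate()                              # worker snapshots gen == 1
loader.navigate()                                    # user navigates again -> gen == 2
loader.finalize_list_loading(["old"], nav_gen=gen)   # discarded (stale)
loader.finalize_list_loading(["new"], nav_gen=None)  # refresh: always applied
```

Using `None` rather than `0` as the "no constraint" sentinel is exactly the S-2 fix: a worker that legitimately snapshotted generation 0 can no longer be confused with an unconstrained refresh call.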