Internal Portal for Consolidating Preliminary Furniture Service Estimates
Brief Description
Developed an internal web tool that let employees gather the separate estimate files for a project in one place: upload them to the system, launch consolidation, and receive a single final file for further work. The project covered the full workflow: system login, file storage, processing launch, control of uploaded documents, and result output.
The project evolved through versions:
- v1: an early version, a specialized page for batch file uploading, processing launch, and obtaining a summary result.
- v2: a more mature version, an internal dashboard with authorization, separate user workspaces, a file registry, an improved interface, and the launch of a new processing algorithm.
Purpose of the Project
The project was needed for the internal consolidation of preliminary estimates for furniture services. It simplified the work for employees who needed to gather several separate estimates for an order, submit them for processing, and quickly receive a single result without manually handling a large number of files.
The system:
- accepted estimate files from employees;
- stored them within a user session and work set;
- provided an interface for managing uploaded files;
- launched server-side processing;
- provided a finished consolidation file for download.
Functions of the Web Project
- Implemented employee authorization and a separate login for the internal portal.
- Displayed a personal user screen with access to their uploaded files.
- Supported batch file uploading via web interface with client-side format and size validation.
- Uploaded files sequentially with progress and status display for each document.
- Maintained a registry of already uploaded files and showed the number of documents in the work set.
- Allowed deleting individual files from the set and clearing the registry completely.
- Separated user data through personal server directories for uploaded files and processing results.
- Provided the ability to launch processing from the interface without manual server intervention.
- Supported switching between old and new processing algorithms.
- Passed user processing parameters, such as duplicate skipping mode.
- Published a link to the finished final file for download after processing was complete.
Web Implementation Highlights
- Implemented the web part in PHP with session-based authorization and a dedicated entry point for the internal portal.
- Divided scenarios into separate handlers: login, file upload, deletion, retrieving the list of uploaded documents, processing launch, result download, and logout.
- Built a user scenario around file processing: from employee login to receiving the final consolidation file.
- Configured separate user workflows through personal server directories for uploaded files and results.
- Implemented batch file uploading via AJAX with client-side validation, progress indicators, statuses, and sequential document transmission.
- Added an interface registry for uploaded files with the ability to delete specific items or clear the entire work set.
- Linked the web interface with the server-side processor via asynchronous launch, allowing users to start processing directly from the portal without manual terminal actions.
- Added passing of processing parameters from the interface to the backend, including duplicate skipping mode.
- Supported switching between the old and new processing pipelines, allowing the system to evolve without abandoning the previous scenario.
- Protected internal handlers from direct access and restricted them to session-based internal requests.
- Implemented result output as a distinct step in the user scenario: after processing, the system published a link to the final file for immediate download.
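The hand-off from the web interface to the processor can be sketched as a small Python entry point. All names here (build_parser, process_user_files, the flag spellings) are illustrative assumptions rather than the project's actual identifiers; only the result file name all_materials.xlsx comes from the case itself.

```python
# Sketch: how a PHP handler might launch the Python processor with
# user-supplied parameters. Identifiers are hypothetical.
import argparse
import tempfile
from pathlib import Path


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Consolidate estimate files")
    parser.add_argument("--user-dir", required=True,
                        help="per-user upload directory")
    parser.add_argument("--out-dir", required=True,
                        help="per-user results directory")
    parser.add_argument("--skip-duplicates", action="store_true",
                        help="skip repeated items within one estimate")
    return parser


def process_user_files(input_dir: Path, output_dir: Path,
                       skip_duplicates: bool = False) -> Path:
    """Consolidate every estimate in input_dir into a single result file."""
    output_dir.mkdir(parents=True, exist_ok=True)
    result = output_dir / "all_materials.xlsx"  # file name taken from the case
    # ... reading, merging, and writing would happen here ...
    result.touch()
    return result


# Demo: the arguments the web layer could pass when spawning the processor.
with tempfile.TemporaryDirectory() as tmp:
    args = build_parser().parse_args(
        ["--user-dir", tmp, "--out-dir", tmp, "--skip-duplicates"])
    final = process_user_files(Path(args.user_dir), Path(args.out_dir),
                               args.skip_duplicates)
    print(final.name)  # all_materials.xlsx
```

The printed path is what the web layer would publish to the user as the download link for the finished file.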
Post-Processing Logic
- The system launched server-side consolidation of data from the uploaded estimate files.
- Supported two processing pipelines: a legacy PHP algorithm and a new Python algorithm.
- Validated input files against the expected work template before processing.
- Gathered data from multiple files into a single summary result for further use by employees.
- Merged repeated items in the final table and supported a separate mode for skipping duplicates within a single estimate.
- Generated a final .xlsx file with the consolidated summary result.
- Provided a summary of processed files, including the number of items found and total amounts.
- In the advanced version, provided more detailed results for auditing discrepancies and monitoring input data quality.
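The merge-and-skip-duplicates behavior above can be sketched with pandas. The column names (name, unit, quantity) and the consolidate helper are illustrative assumptions, since the real work template's layout is not shown in the case; only the composite key (name + unit of measure) and the duplicate-skipping rule follow the description.

```python
# Minimal sketch of the consolidation step, assuming each estimate is already
# loaded into a DataFrame with hypothetical "name", "unit", "quantity" columns.
import pandas as pd


def consolidate(estimates: list[pd.DataFrame],
                skip_duplicates: bool = False) -> pd.DataFrame:
    frames = []
    for df in estimates:
        if skip_duplicates:
            # drop repeated items *within one estimate* so they do not
            # inflate the consolidated total
            df = df.drop_duplicates(subset=["name", "unit"], keep="first")
        frames.append(df)
    merged = pd.concat(frames, ignore_index=True)
    # composite key: name + unit of measure, as in the case description
    return (merged.groupby(["name", "unit"], as_index=False)
                  .agg(quantity=("quantity", "sum"),
                       occurrences=("quantity", "size")))


# Two toy estimates: "screws" repeats inside the first file.
est1 = pd.DataFrame({"name": ["screws", "screws", "board"],
                     "unit": ["pcs", "pcs", "m2"],
                     "quantity": [10, 10, 3]})
est2 = pd.DataFrame({"name": ["screws"], "unit": ["pcs"], "quantity": [5]})

print(consolidate([est1, est2], skip_duplicates=True))
```

With skip_duplicates enabled, the repeated "screws" row inside the first estimate is ignored, so the summary shows 15 pcs (10 + 5) rather than 25; the occurrences column is the seed of the per-item origin trace described later.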
Python Implementation Highlights
- Implemented a separate Python processor launched from the web interface that worked with the user's uploaded file directory.
- Configured separate input and output directories per user, including clearing previous results before new processing.
- Built a processing pipeline using pandas, xlrd, and openpyxl to separate data reading, aggregation, and final file assembly.
- Added normalization for rows and material names: cleaning control characters, whitespace trimming, Unicode normalization, and name stabilization for subsequent comparison.
- Implemented logic for adapting to variable estimate table structures: the processor was not hardcoded to a single fixed column layout and could adjust to shifted headers and different field arrangements.
- Built an internal data map per material: group, unit of measure, price, coefficient, calculated quantity, manual quantity, actual quantity, and source coordinates.
- Added a skip_duplicates mode that allowed excluding repeating items within a single estimate to avoid inflating the total during consolidation.
- Implemented material merging based on a composite key (name + unit of measure) to collect results from several files into one summary entry.
- Maintained a trace of the origin of each item: which file it came from, the number of occurrences, and its original coordinates. This transformed the final file into an audit tool rather than just a summary table.
- Added logic for cases where material is present in actual data but missing from the main estimate: such items were not lost and were marked separately in the result.
- Generated the final all_materials.xlsx via pandas, then performed a second post-processing phase using openpyxl.
- In the second phase, added formulas for total cost and cost in product, currency formatting, column auto-width, text wrapping, auto-filters, and total sums.
- Implemented diagnostic row highlighting for several scenarios: duplicate materials in one file, quantity discrepancies, price differences, coefficient differences, and items missing from the main estimate.
- Added automatic cell comments so that the reason for a discrepancy and its data source could be seen directly in the final .xlsx without searching through source files.
- Included a separate highlighting legend in the resulting Excel file so the outcome was clear to employees without additional explanation from the developer.
User Workflow
- The employee logged into the internal portal.
- Uploaded a batch of estimate files via the web form.
- Viewed the list of already uploaded documents and could remove unnecessary ones.
- Launched the desired processing variant.
- Waited for the server process to complete.
- Downloaded the new final consolidation file.
Evolution of Versions
v1
The first version was closer to a specialized utility:
- simple page for the consolidation task;
- batch file upload;
- processing launch and final result generation;
- output of intermediate processing information;
- grouping of matching and single items in the result.
v2
In the second version, the project became a proper internal portal:
- separate login screen;
- session-based user authorization;
- files stored separately by user;
- registry of uploaded files;
- sequential uploading with progress and a more user-friendly interface;
- separate actions for deleting files and clearing the set;
- launch of the new algorithm built directly into the interface via asynchronous request;
- processing result provided as a clear next step in the user scenario;
- a more mature layer for quality control of the final consolidation.
Technologies
- PHP
- JavaScript / jQuery
- AJAX
- HTML / CSS
- Python as server-side processing engine
- Session-based authorization
- File-based storage for work sets
My role
My role was end-to-end: from understanding the internal employee workflow and designing the interface to implementing the internal portal, the file upload logic, and the server-side handlers, linking the web part with the processing engine, and evolving the project from an early utility into a mature internal tool.
Practical value of the case
This case demonstrates experience in:
- developing internal portals and administrative interfaces;
- designing application workflows around file processing;
- organizing batch file uploads via web interface;
- integrating web interfaces with server-side processing and final file generation;
- building user scenarios from system login to obtaining a result;
- evolving a product through versions based on real employee workflows.