
Closed
Opened Sep 24, 2015 by Shane Snyder @ssnyder

track per-process information in addition to per-file information

This ticket would be best done after modularizing the log format (see #46 (closed)). In addition to storing per-file records, we could store a summary record per process (spanning all of its files) as well.

For example, we might want to know the total amount of time a process spent on metadata operations, reads, and writes, as well as the number of bytes it read and wrote, regardless of how many files it opened or how many threads it used.
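
A hypothetical sketch of what such a per-process record could hold is below; the struct and field names are illustrative only, not part of the actual Darshan log format:

```c
#include <stdint.h>

/* Illustrative sketch only -- the struct and field names are hypothetical
 * and not part of the actual Darshan log format. */
struct darshan_process_record
{
    int64_t rank;           /* MPI rank this record describes */
    int64_t bytes_read;     /* total bytes read, across all files */
    int64_t bytes_written;  /* total bytes written, across all files */
    double  meta_time;      /* cumulative time spent in metadata operations */
    double  read_time;      /* cumulative time spent in read operations */
    double  write_time;     /* cumulative time spent in write operations */
    int64_t files_opened;   /* number of distinct files opened */
    int64_t threads_used;   /* number of threads that performed I/O */
};
```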

We could use this per-process data (in conjunction with a reduction step) to produce an immediate performance estimate without much post-processing.
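
A minimal sketch of that reduction step, assuming standard MPI collectives and byte totals taken from a per-process record like the one above (again illustrative, not Darshan's actual shutdown code):

```c
#include <mpi.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical aggregation step: every rank contributes the byte totals from
 * its per-process record, and rank 0 prints a rough aggregate I/O rate. */
void report_perf_estimate(int64_t bytes_read, int64_t bytes_written,
                          double elapsed_time, MPI_Comm comm)
{
    int rank;
    int64_t local_bytes = bytes_read + bytes_written;
    int64_t total_bytes = 0;

    MPI_Comm_rank(comm, &rank);
    MPI_Reduce(&local_bytes, &total_bytes, 1, MPI_INT64_T, MPI_SUM, 0, comm);

    if (rank == 0 && elapsed_time > 0.0)
        printf("estimated aggregate I/O rate: %.2f MiB/s\n",
               (double)total_bytes / (1024.0 * 1024.0) / elapsed_time);
}
```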

This also depends on accurate thread accounting in #81 (closed).

Milestone: triage-feature-request
Labels: defect, wrapper libraries
Reference: darshan/darshan#48